{
  "cells": [
    {
      "cell_type": "markdown",
      "source": [
        "[![Roboflow Notebooks](https://media.roboflow.com/notebooks/template/bannertest2-2.png?ik-sdk-version=javascript-1.4.3&updatedAt=1672932710194)](https://github.com/roboflow/notebooks)\n",
        "\n",
        "# Fine-tune PaliGemma on Object Detection Dataset\n",
        "\n",
        "---\n",
        "\n",
        "[![GitHub](https://badges.aleen42.com/src/github.svg)](https://github.com/google-research/big_vision/blob/main/big_vision/configs/proj/paligemma/README.md)\n",
        "[![Roboflow](https://raw.githubusercontent.com/roboflow-ai/notebooks/main/assets/badges/roboflow-blogpost.svg)](https://blog.roboflow.com/how-to-fine-tune-paligemma/)\n",
        "[![YouTube](https://badges.aleen42.com/src/youtube.svg)](https://www.youtube.com/watch?v=OMBmVInx68M)\n",
        "\n",
        "PaliGemma is an open vision-language model (VLM) inspired by PaLI-3, built with\n",
        "open components, such as\n",
        "the [SigLIP vision model](https://colab.research.google.com/github/google-research/big_vision/blob/main/big_vision/configs/proj/image_text/SigLIP_demo.ipynb)\n",
        "and\n",
        "the [Gemma language model](https://ai.google.dev/gemma).\n",
        "PaliGemma is designed as a versatile model for transfer to a wide range of\n",
        "vision-language tasks such as image and short video captioning, visual question\n",
        "answering, text reading, object detection and object segmentation. Together with\n",
        "the pretrained and transfer checkpoints at multiple resolutions, we provide a\n",
        "checkpoint transferred to a mixture of tasks that can be used for off-the-shelf\n",
        "exploration.\n",
        "\n",
        "This notebook is an extension of the [official notebook](https://colab.research.google.com/github/google-research/big_vision/blob/main/big_vision/configs/proj/paligemma/finetune_paligemma.ipynb) prepared by Google Research.\n",
        "\n",
        "![PaliGemma model](https://storage.cloud.google.com/com-roboflow-marketing/notebooks/examples/paligemma.png)\n",
        "\n",
        "This notebook shows how to fine-tune [PaliGemma](https://ai.google.dev/gemma/docs/paligemma) on a vision-language task with [JAX](https://jax.readthedocs.io/en/latest/index.html). *Fine-tuning* is a process that can improve your model's performance on specific tasks or help the model adhere to specific output requirements when instructions aren't sufficient and you have a set of examples that demonstrate the outputs you want. Gemma-based models like PaliGemma require fine-tuning to produce expected results.\n",
        "\n",
        "To make it runnable on a T4 Colab runtime with 16 GB of HBM and 12 GB of RAM, we fine-tune only the attention layers of the language model and freeze all other parameters."
      ],
      "metadata": {
        "id": "4LqvmtZPzyY1"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "## Setup"
      ],
      "metadata": {
        "id": "lBp3Czz3GBmc"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Get access to PaliGemma\n",
        "\n",
        "Before using PaliGemma for the first time, you must request access to the model through Kaggle by completing the following steps:\n",
        "\n",
        "1. Log in to [`Kaggle`](https://www.kaggle.com), or create a new Kaggle account if you don't already have one.\n",
        "1. Go to the [`PaliGemma Model Card`](https://www.kaggle.com/models/google/paligemma/) and click `Request Access`.\n",
        "1. Complete the consent form and accept the terms and conditions."
      ],
      "metadata": {
        "id": "4ohXT9pQFjZs"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Configure your API keys\n",
        "\n",
        "To use PaliGemma, you need to provide your Kaggle username, Kaggle API key, and Roboflow API key. Follow these steps:\n",
        "\n",
        "- Open your [`Kaggle Settings`](https://www.kaggle.com/settings) page. Click `Create New Token`. This will download a `kaggle.json` file containing your API credentials.\n",
        "- Go to your [`Roboflow Settings`](https://app.roboflow.com/settings/api) page. Click `Copy`. This will place your private key in the clipboard.\n",
        "- In Colab, go to the left pane and click on `Secrets` (🔑).\n",
        "    - Store Kaggle Username under the name `KAGGLE_USERNAME`.\n",
        "    - Store Kaggle API Key under the name `KAGGLE_KEY`.\n",
        "    - Store Roboflow API Key under the name `ROBOFLOW_API_KEY`."
      ],
      "metadata": {
        "id": "ADTkh-2y_9Yv"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Select the runtime\n",
        "\n",
        "Let's make sure that we have access to a GPU. We can use the `nvidia-smi` command to do that. If you run into any problems, navigate to `Edit` -> `Notebook settings` -> `Hardware accelerator`, set it to `T4 GPU`, and then click `Save`."
      ],
      "metadata": {
        "id": "4wyojKiG_hX9"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "!nvidia-smi"
      ],
      "metadata": {
        "id": "O_8BLW6R_x-z",
        "outputId": "f2c36369-fd26-4b58-c93f-4ef2aebc1764",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": null,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Thu May 30 08:04:49 2024       \n",
            "+---------------------------------------------------------------------------------------+\n",
            "| NVIDIA-SMI 535.104.05             Driver Version: 535.104.05   CUDA Version: 12.2     |\n",
            "|-----------------------------------------+----------------------+----------------------+\n",
            "| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |\n",
            "| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |\n",
            "|                                         |                      |               MIG M. |\n",
            "|=========================================+======================+======================|\n",
            "|   0  Tesla T4                       Off | 00000000:00:04.0 Off |                    0 |\n",
            "| N/A   67C    P8              10W /  70W |      0MiB / 15360MiB |      0%      Default |\n",
            "|                                         |                      |                  N/A |\n",
            "+-----------------------------------------+----------------------+----------------------+\n",
            "                                                                                         \n",
            "+---------------------------------------------------------------------------------------+\n",
            "| Processes:                                                                            |\n",
            "|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |\n",
            "|        ID   ID                                                             Usage      |\n",
            "|=======================================================================================|\n",
            "|  No running processes found                                                           |\n",
            "+---------------------------------------------------------------------------------------+\n"
          ]
        }
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Download dataset from Roboflow Universe\n",
        "\n",
        "To fine-tune PaliGemma, prepare your dataset in JSONL format. You can use Roboflow to easily convert any dataset into this format."
      ],
      "metadata": {
        "id": "FMlw3ru1YvLg"
      }
    },
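    {
      "cell_type": "markdown",
      "source": [
        "Each line of the exported JSONL file pairs one image with a detection `prefix` and `suffix`. As an illustrative sketch (the file name, class names, and coordinates below are hypothetical, not taken from this dataset), a single line looks like this:\n",
        "\n",
        "```json\n",
        "{\"image\": \"example.jpg\", \"prefix\": \"detect plus ; minus\", \"suffix\": \"<loc0123><loc0456><loc0789><loc1000> plus ; <loc0100><loc0200><loc0300><loc0400> minus\"}\n",
        "```\n",
        "\n",
        "PaliGemma encodes each bounding box as four `<locXXXX>` tokens (`y_min`, `x_min`, `y_max`, `x_max`, each normalized to the range 0-1023) followed by the class name, with multiple detections separated by `;`."
      ],
      "metadata": {}
    },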
    {
      "cell_type": "code",
      "source": [
        "!pip install -q roboflow\n",
        "!pip install -q git+https://github.com/roboflow/supervision.git"
      ],
      "metadata": {
        "id": "Wtvz4QZ9YuG8",
        "outputId": "7ca5c1c8-4aa1-442c-c942-5cdc15fc277a",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": null,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.5/75.5 kB\u001b[0m \u001b[31m1.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m158.3/158.3 kB\u001b[0m \u001b[31m6.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m178.7/178.7 kB\u001b[0m \u001b[31m15.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m58.8/58.8 kB\u001b[0m \u001b[31m4.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m49.1/49.1 MB\u001b[0m \u001b[31m30.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m54.5/54.5 kB\u001b[0m \u001b[31m5.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25h"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "from google.colab import userdata\n",
        "from roboflow import Roboflow\n",
        "\n",
        "ROBOFLOW_API_KEY = userdata.get('ROBOFLOW_API_KEY')\n",
        "rf = Roboflow(api_key=ROBOFLOW_API_KEY)\n",
        "\n",
        "project = rf.workspace(\"roboflow-jvuqo\").project(\"number-ops-j1426\")\n",
        "version = project.version(1)\n",
        "dataset = version.download(\"paligemma\")"
      ],
      "metadata": {
        "id": "TGDFTYVnY4zn"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "!head -n 5 {dataset.location}/dataset/_annotations.train.jsonl"
      ],
      "metadata": {
        "id": "WLhSenP5AtQe"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "!head -n 5 {dataset.location}/dataset/_annotations.valid.jsonl"
      ],
      "metadata": {
        "id": "YwHY21ABA0WG"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "import cv2\n",
        "import json\n",
        "import supervision as sv\n",
        "from itertools import islice\n",
        "from typing import List\n",
        "\n",
        "def read_n_lines(file_path: str, n: int) -> List[str]:\n",
        "    # islice stops early if the file has fewer than n lines, avoiding\n",
        "    # the StopIteration that calling next(file) directly would raise.\n",
        "    with open(file_path, 'r') as file:\n",
        "        return [line.strip() for line in islice(file, n)]\n",
        "\n",
        "images = []\n",
        "lines = read_n_lines(f\"{dataset.location}/dataset/_annotations.train.jsonl\", 25)\n",
        "first = json.loads(lines[0])\n",
        "\n",
        "CLASSES = first.get('prefix').replace(\"detect \", \"\").split(\" ; \")\n",
        "\n",
        "for line in lines:\n",
        "    data = json.loads(line)\n",
        "    image = cv2.imread(f\"{dataset.location}/dataset/{data.get('image')}\")\n",
        "    (h, w, _) = image.shape\n",
        "    detections = sv.Detections.from_lmm(\n",
        "        lmm='paligemma',\n",
        "        result=data.get('suffix'),\n",
        "        resolution_wh=(w, h),\n",
        "        classes=CLASSES)\n",
        "\n",
        "    image = sv.BoundingBoxAnnotator(thickness=4).annotate(image, detections)\n",
        "    image = sv.LabelAnnotator(text_scale=2, text_thickness=4).annotate(image, detections)\n",
        "    images.append(image)\n",
        "\n",
        "sv.plot_images_grid(images, (5, 5))"
      ],
      "metadata": {
        "id": "6ihTTuTd747l"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Fetch the `big_vision` repository and install related dependencies\n",
        "\n",
        "Download the `big_vision` repository to your Colab notebook from GitHub and install dependencies related to `big_vision` by running the following code."
      ],
      "metadata": {
        "id": "eg3sqaoPFS3W"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "DfxKb3F839Ks"
      },
      "outputs": [],
      "source": [
        "import os\n",
        "import sys\n",
        "\n",
        "# Colab runtimes with remote TPUs are not supported by this notebook.\n",
        "if \"COLAB_TPU_ADDR\" in os.environ:\n",
        "  raise RuntimeError(\"It seems you are using Colab with remote TPUs which is not supported.\")\n",
        "\n",
        "# Fetch big_vision repository if python doesn't know about it and install\n",
        "# dependencies needed for this notebook.\n",
        "if not os.path.exists(\"big_vision_repo\"):\n",
        "  !git clone --quiet --branch=main --depth=1 \\\n",
        "     https://github.com/google-research/big_vision big_vision_repo\n",
        "\n",
        "# Append big_vision code to python import path\n",
        "if \"big_vision_repo\" not in sys.path:\n",
        "  sys.path.append(\"big_vision_repo\")\n",
        "\n",
        "# Install missing dependencies. Assume jax~=0.4.25 with GPU available.\n",
        "!pip3 install -q \"overrides\" \"ml_collections\" \"einops~=0.7\" \"sentencepiece\"\n"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Set environment variables\n",
        "\n",
        "Set the environment variables for `KAGGLE_USERNAME` and `KAGGLE_KEY`."
      ],
      "metadata": {
        "id": "YU2fs7d0F1Fo"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "zGLIp1Cx3_CX"
      },
      "outputs": [],
      "source": [
        "import os\n",
        "from google.colab import userdata\n",
        "\n",
        "# Note: `userdata.get` is a Colab API. If you're not using Colab, set the env\n",
        "# vars as appropriate or make your credentials available in ~/.kaggle/kaggle.json\n",
        "\n",
        "os.environ[\"KAGGLE_USERNAME\"] = userdata.get('KAGGLE_USERNAME')\n",
        "os.environ[\"KAGGLE_KEY\"] = userdata.get('KAGGLE_KEY')"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Import JAX and other dependencies\n",
        "\n",
        "Import JAX and other dependencies required for PaliGemma, like TensorFlow and NumPy."
      ],
      "metadata": {
        "id": "zx3dj5NzG93I"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "import base64\n",
        "import functools\n",
        "import html\n",
        "import io\n",
        "import os\n",
        "import warnings\n",
        "\n",
        "import jax\n",
        "import jax.numpy as jnp\n",
        "import numpy as np\n",
        "import ml_collections\n",
        "\n",
        "import tensorflow as tf\n",
        "import sentencepiece\n",
        "\n",
        "from IPython.display import display, HTML\n",
        "from PIL import Image\n",
        "from tqdm.notebook import tqdm\n",
        "\n",
        "# Import model definition from big_vision\n",
        "from big_vision.models.proj.paligemma import paligemma\n",
        "from big_vision.trainers.proj.paligemma import predict_fns\n",
        "\n",
        "# Import big vision utilities\n",
        "import big_vision.datasets.jsonl\n",
        "import big_vision.utils\n",
        "import big_vision.sharding\n",
        "\n",
        "# Don't let TF use the GPU or TPUs\n",
        "tf.config.set_visible_devices([], \"GPU\")\n",
        "tf.config.set_visible_devices([], \"TPU\")\n",
        "\n",
        "backend = jax.lib.xla_bridge.get_backend()\n",
        "print(f\"JAX version:  {jax.__version__}\")\n",
        "print(f\"JAX platform: {backend.platform}\")\n",
        "print(f\"JAX devices:  {jax.device_count()}\")"
      ],
      "metadata": {
        "id": "OlWELn2FHB22"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "source": [
        "## Download and configure the model\n",
        "\n",
        "In this step, you'll download the model checkpoint and configure it so that you can fine-tune it later on. This step shows you how to move model parameters into GPU/TPU memory, which is useful for fine-tuning models on devices with limited resources."
      ],
      "metadata": {
        "id": "_PHhkFGuHMFF"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Download the model checkpoint\n",
        "\n",
        "PaliGemma includes several model variations. For this tutorial, you'll use the base [JAX/FLAX PaliGemma 3B weight model](https://www.kaggle.com/models/google/paligemma/jax/paligemma-3b-pt-224).\n",
        "\n",
        "Download the `float16` version of the model checkpoint from Kaggle by running the following code. This process takes several minutes to complete."
      ],
      "metadata": {
        "id": "9wU_sHbGHQka"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "gQNOTfF24AV4"
      },
      "outputs": [],
      "source": [
        "import os\n",
        "import kagglehub\n",
        "\n",
        "MODEL_PATH = \"./pt_224_128.params.f16.npz\"\n",
        "if not os.path.exists(MODEL_PATH):\n",
        "    print(\"Downloading the checkpoint from Kaggle, this could take a few minutes....\")\n",
        "    # Note: kaggle archive contains the same checkpoint in multiple formats.\n",
        "    # Download only the float16 model.\n",
        "    MODEL_PATH = kagglehub.model_download('google/paligemma/jax/paligemma-3b-pt-224', 'paligemma-3b-pt-224.f16.npz')\n",
        "    print(f\"Model path: {MODEL_PATH}\")\n",
        "\n",
        "TOKENIZER_PATH = \"./paligemma_tokenizer.model\"\n",
        "if not os.path.exists(TOKENIZER_PATH):\n",
        "    print(\"Downloading the model tokenizer...\")\n",
        "    !gsutil cp gs://big_vision/paligemma_tokenizer.model {TOKENIZER_PATH}\n",
        "    print(f\"Tokenizer path: {TOKENIZER_PATH}\")"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Configure the model\n",
        "\n",
        "It's time to actually start configuring the model that you're going to use.\n",
        "\n",
        "For this notebook, you need to be able to fit your model onto a T4 GPU. Having a limited resource like space constraints means that you have to be mindful of how your model is configured.\n",
        "\n",
        "If you fine-tune every parameter, your model won't be able to run in the notebook environment. In this part of the notebook, you'll therefore configure your model so that some of the parameters are frozen and only the parameters that matter most for this task are fine-tuned. In LLMs, parameters are said to be *frozen* when they are no longer updated during training.\n",
        "\n",
        "In order to configure your model, you need to:\n",
        "\n",
        "* Initialize the `model_config` as a [`FrozenConfigDict`](https://github.com/google/ml_collections/tree/master#frozenconfigdict) so that you can freeze some of the parameters and keep memory usage low\n",
        "* Initialize an instance of the PaliGemma `Model` class using the `model_config` as its configurations\n",
        "* Load the model parameters into RAM\n",
        "* Define a `decode` function to sample outputs from the model\n",
        "\n",
        "The code in this cell takes about a minute to run to completion."
      ],
      "metadata": {
        "id": "fnCT0G9sHxsX"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "1aghcULcEdtv"
      },
      "outputs": [],
      "source": [
        "# Define model\n",
        "model_config = ml_collections.FrozenConfigDict({\n",
        "    \"llm\": {\"vocab_size\": 257_152},\n",
        "    \"img\": {\"variant\": \"So400m/14\", \"pool_type\": \"none\", \"scan\": True, \"dtype_mm\": \"float16\"}\n",
        "})\n",
        "model = paligemma.Model(**model_config)\n",
        "tokenizer = sentencepiece.SentencePieceProcessor(TOKENIZER_PATH)\n",
        "\n",
        "# Load params - this can take up to 1 minute in T4 colabs.\n",
        "params = paligemma.load(None, MODEL_PATH, model_config)\n",
        "\n",
        "# Define `decode` function to sample outputs from the model.\n",
        "decode_fn = predict_fns.get_all(model)['decode']\n",
        "decode = functools.partial(decode_fn, devices=jax.devices(), eos_token=tokenizer.eos_id())"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Move model parameters into GPU/TPU memory\n",
        "\n",
        "Now you need to move the model parameters into GPU/TPU memory. First, shard the parameters across the available GPUs, then load the parameters. Here, you'll load the parameters sequentially. This takes longer than loading them all at once, but loading them simultaneously would require more RAM than you have available in this notebook.\n",
        "\n",
        "Finally, print out all of the parameters to see what type each individual parameter is cast to. Frozen parameters are kept as `float16`, while the trainable parameters are cast to `float32`. When you inspect the list, you'll see that most of the parameters have been frozen and are `float16`."
      ],
      "metadata": {
        "id": "jytrSroKIfLD"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "RWOdf_fw2SAO"
      },
      "outputs": [],
      "source": [
        "# Create a pytree mask of the trainable params.\n",
        "def is_trainable_param(name, param):  # pylint: disable=unused-argument\n",
        "  if name.startswith(\"llm/layers/attn/\"):  return True\n",
        "  if name.startswith(\"llm/\"):              return False\n",
        "  if name.startswith(\"img/\"):              return False\n",
        "  raise ValueError(f\"Unexpected param name {name}\")\n",
        "trainable_mask = big_vision.utils.tree_map_with_names(is_trainable_param, params)\n",
        "\n",
        "# If more than one device is available (e.g. multiple GPUs) the parameters can\n",
        "# be sharded across them to reduce HBM usage per device.\n",
        "mesh = jax.sharding.Mesh(jax.devices(), (\"data\",))\n",
        "\n",
        "data_sharding = jax.sharding.NamedSharding(\n",
        "    mesh, jax.sharding.PartitionSpec(\"data\"))\n",
        "\n",
        "params_sharding = big_vision.sharding.infer_sharding(\n",
        "    params, strategy=[('.*', 'fsdp(axis=\"data\")')], mesh=mesh)\n",
        "\n",
        "# Suppress the expected warning that some donated buffers are not usable.\n",
        "warnings.filterwarnings(\n",
        "    \"ignore\", message=\"Some donated buffers were not usable\")\n",
        "\n",
        "@functools.partial(jax.jit, donate_argnums=(0,), static_argnums=(1,))\n",
        "def maybe_cast_to_f32(params, trainable):\n",
        "  return jax.tree.map(lambda p, m: p.astype(jnp.float32) if m else p,\n",
        "                      params, trainable)\n",
        "\n",
        "# Loading all params simultaneously is faster and more succinct, but it\n",
        "# requires more RAM than the T4 colab runtimes have by default.\n",
        "# Instead we do it param by param.\n",
        "params, treedef = jax.tree.flatten(params)\n",
        "sharding_leaves = jax.tree.leaves(params_sharding)\n",
        "trainable_leaves = jax.tree.leaves(trainable_mask)\n",
        "for idx, (sharding, trainable) in enumerate(zip(sharding_leaves, trainable_leaves)):\n",
        "  params[idx] = big_vision.utils.reshard(params[idx], sharding)\n",
        "  params[idx] = maybe_cast_to_f32(params[idx], trainable)\n",
        "  params[idx].block_until_ready()\n",
        "params = jax.tree.unflatten(treedef, params)\n",
        "\n",
        "# Print params to show what the model is made of.\n",
        "def parameter_overview(params):\n",
        "  for path, arr in big_vision.utils.tree_flatten_with_names(params)[0]:\n",
        "    print(f\"{path:80s} {str(arr.shape):22s} {arr.dtype}\")\n",
        "\n",
        "print(\" == Model params == \")\n",
        "parameter_overview(params)"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "## Prepare to tune the model\n",
        "\n",
        "Now that your model is configured, you can tune it. In this step, you'll create your model's inputs as well as the training and validation iterators, view the training examples, and define the training and validation loops."
      ],
      "metadata": {
        "id": "Y1sfijh_Ix09"
      }
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Create model inputs\n",
        "\n",
        "The model checkpoint you're using has already been trained on images of various aspect ratios resized to 224x224 pixels, and on tokenized text paired with those images.\n",
        "\n",
        "The code below defines three functions that you'll use in the next step to create the model's inputs:\n",
        "\n",
        "* **`preprocess_image`:** Normalizes the image data. In this case, pre-processing stacks single-channel (greyscale) images into three channels, removes the alpha layer, and resizes the image to the size required by the model for image inputs (224x224 pixels).\n",
        "* **`preprocess_tokens`:** Splits the tokens up and adds flags to mark whether a token is a prefix or suffix token. These flags will be used later on in the code, during the training step and the evaluation loop.\n",
        "* **`postprocess_tokens`:** Removes any tokens left at and/or after the end-of-sequence (EOS) token and returns the remaining decoded tokens.\n"
      ],
      "metadata": {
        "id": "6hNkSJwJI138"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "8SRW0NuU4UcW"
      },
      "outputs": [],
      "source": [
        "def preprocess_image(image, size=224):\n",
        "  # Model has been trained to handle images of different aspect ratios\n",
        "  # resized to 224x224 in the range [-1, 1]. Bilinear and antialias resize\n",
        "  # options are helpful to improve quality in some tasks.\n",
        "  image = np.asarray(image)\n",
        "  if image.ndim == 2:  # Stack a single-channel (greyscale) image into 3 channels.\n",
        "    image = np.stack((image,)*3, axis=-1)\n",
        "  image = image[..., :3]  # Remove alpha layer.\n",
        "  assert image.shape[-1] == 3\n",
        "\n",
        "  image = tf.constant(image)\n",
        "  image = tf.image.resize(image, (size, size), method='bilinear', antialias=True)\n",
        "  return image.numpy() / 127.5 - 1.0  # [0, 255]->[-1,1]\n",
        "\n",
        "def preprocess_tokens(prefix, suffix=None, seqlen=None):\n",
        "  # Model has been trained to handle tokenized text composed of a prefix with\n",
        "  # full attention and a suffix with causal attention.\n",
        "  separator = \"\\n\"\n",
        "  tokens = tokenizer.encode(prefix, add_bos=True) + tokenizer.encode(separator)\n",
        "  mask_ar = [0] * len(tokens)    # 0 to use full attention for prefix.\n",
        "  mask_loss = [0] * len(tokens)  # 0 to not use prefix tokens in the loss.\n",
        "\n",
        "  if suffix:\n",
        "    suffix = tokenizer.encode(suffix, add_eos=True)\n",
        "    tokens += suffix\n",
        "    mask_ar += [1] * len(suffix)    # 1 to use causal attention for suffix.\n",
        "    mask_loss += [1] * len(suffix)  # 1 to use suffix tokens in the loss.\n",
        "\n",
        "  mask_input = [1] * len(tokens)    # 1 if it's a token, 0 if padding.\n",
        "  if seqlen:\n",
        "    padding = [0] * max(0, seqlen - len(tokens))\n",
        "    tokens = tokens[:seqlen] + padding\n",
        "    mask_ar = mask_ar[:seqlen] + padding\n",
        "    mask_loss = mask_loss[:seqlen] + padding\n",
        "    mask_input = mask_input[:seqlen] + padding\n",
        "\n",
        "  return jax.tree.map(np.array, (tokens, mask_ar, mask_loss, mask_input))\n",
        "\n",
        "def postprocess_tokens(tokens):\n",
        "  tokens = tokens.tolist()  # np.array to list[int]\n",
        "  try:  # Remove tokens at and after EOS if any.\n",
        "    eos_pos = tokens.index(tokenizer.eos_id())\n",
        "    tokens = tokens[:eos_pos]\n",
        "  except ValueError:\n",
        "    pass\n",
        "  return tokenizer.decode(tokens)"
      ]
    },
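    {
      "cell_type": "markdown",
      "source": [
        "As a quick sanity check, the cell below reproduces the mask layout that `preprocess_tokens` builds: `0`s over the prefix (full attention, excluded from the loss), `1`s over the suffix (causal attention, included in the loss), and `0` padding up to `seqlen`. The prefix and suffix lengths here are toy values chosen for illustration, not real token counts."
      ],
      "metadata": {}
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Illustrative only: the same mask construction as preprocess_tokens,\n",
        "# for a hypothetical 2-token prefix and 3-token suffix padded to seqlen=8.\n",
        "prefix_len, suffix_len, seqlen = 2, 3, 8\n",
        "mask_ar = [0] * prefix_len + [1] * suffix_len\n",
        "mask_loss = [0] * prefix_len + [1] * suffix_len\n",
        "mask_input = [1] * (prefix_len + suffix_len)\n",
        "padding = [0] * (seqlen - (prefix_len + suffix_len))\n",
        "print(\"mask_ar:   \", mask_ar + padding)     # [0, 0, 1, 1, 1, 0, 0, 0]\n",
        "print(\"mask_loss: \", mask_loss + padding)   # [0, 0, 1, 1, 1, 0, 0, 0]\n",
        "print(\"mask_input:\", mask_input + padding)  # [1, 1, 1, 1, 1, 0, 0, 0]"
      ]
    },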
    {
      "cell_type": "markdown",
      "source": [
        "### Create the training and validation iterators\n",
        "\n",
        "Create two iterators:\n",
        "\n",
        "*   A **training iterator** to allow the training process to go through the data in chunks rather than processing it all at once. This allows you to do some data pre-processing before use.\n",
        "*   A **validation iterator** that lets the training process iterate over the validation dataset to check how well the tuned model's outputs align with the expected results."
      ],
      "metadata": {
        "id": "h4Lul8c3JDBQ"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "whzWOojGOtzi"
      },
      "outputs": [],
      "source": [
        "SEQLEN = 128\n",
        "\n",
        "train_dataset = big_vision.datasets.jsonl.DataSource(\n",
        "    os.path.join(dataset.location, \"dataset/_annotations.train.jsonl\"),\n",
        "    fopen_keys={\"image\": f\"{dataset.location}/dataset\"})\n",
        "\n",
        "val_dataset = big_vision.datasets.jsonl.DataSource(\n",
        "    os.path.join(dataset.location, \"dataset/_annotations.valid.jsonl\"),\n",
        "    fopen_keys={\"image\": f\"{dataset.location}/dataset\"})\n",
        "\n",
        "\n",
        "def train_data_iterator():\n",
        "  \"\"\"Never ending iterator over training examples.\"\"\"\n",
        "  # Shuffle examples and repeat so one can train for many epochs.\n",
        "  dataset = train_dataset.get_tfdata().shuffle(1_000).repeat()\n",
        "  for example in dataset.as_numpy_iterator():\n",
        "    image = Image.open(io.BytesIO(example[\"image\"]))\n",
        "    image = preprocess_image(image)\n",
        "\n",
        "    prefix = example[\"prefix\"].decode().lower()\n",
        "    suffix = example[\"suffix\"].decode().lower()\n",
        "    tokens, mask_ar, mask_loss, _ = preprocess_tokens(prefix, suffix, SEQLEN)\n",
        "    label, _, _, _ = preprocess_tokens(suffix, seqlen=SEQLEN)\n",
        "\n",
        "    yield {\n",
        "        \"image\": np.asarray(image),\n",
        "        \"text\": np.asarray(tokens),\n",
        "        \"label\": np.asarray(label),\n",
        "        \"mask_ar\": np.asarray(mask_ar),\n",
        "        \"mask_loss\": np.asarray(mask_loss),\n",
        "    }\n",
        "\n",
        "\n",
        "def validation_data_iterator():\n",
        "  \"\"\"Single iterator over validation examples.\"\"\"\n",
        "  for example in val_dataset.get_tfdata(ordered=True).as_numpy_iterator():\n",
        "    image = Image.open(io.BytesIO(example[\"image\"]))\n",
        "    image = preprocess_image(image)\n",
        "\n",
        "    prefix = example[\"prefix\"].decode().lower()\n",
        "    suffix = example[\"suffix\"].decode().lower()\n",
        "    tokens, mask_ar, _, mask_input = preprocess_tokens(prefix, seqlen=SEQLEN)\n",
        "    label, _, _, _ = preprocess_tokens(suffix, seqlen=SEQLEN)\n",
        "\n",
        "    yield {\n",
        "        \"image\": np.asarray(image),\n",
        "        \"text\": np.asarray(tokens),\n",
        "        \"label\": np.asarray(label),\n",
        "        \"mask_ar\": np.asarray(mask_ar),\n",
        "        \"mask_input\": np.asarray(mask_input),\n",
        "    }\n"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### View training examples\n",
        "\n",
        "In this notebook, the training data contains 90 images that are paired with long descriptions of what's depicted in the image.\n",
        "\n",
        "**Note:** Normal training data sets that are meant to be used for practical use cases should contain more images, but this notebook limits the number of data points so that you can train the model in a reasonable amount of time for an example.\n",
        "\n",
        "The code below prints a random selection of images with their descriptions from the training data set so that you can see what the images and descriptions your model is trained on looks like. Each image is displayed in as a 128x128 pixel JPEG, with the description printed next to the image to the right."
      ],
      "metadata": {
        "id": "ml_wkTbJJj-N"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 167
        },
        "id": "BzJfb5t0nsLq",
        "outputId": "65726c5b-2d7e-4fb8-b5f4-5183978f8118"
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Training examples\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0204&gt;&lt;loc0178&gt;&lt;loc0972&gt;&lt;loc0720&gt; mult</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0128&gt;&lt;loc0284&gt;&lt;loc0752&gt;&lt;loc0751&gt; 3</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0277&gt;&lt;loc0356&gt;&lt;loc0685&gt;&lt;loc0612&gt; 6</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0128&gt;&lt;loc0271&gt;&lt;loc0765&gt;&lt;loc0736&gt; 4</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        }
      ],
      "source": [
        "def split_and_keep_second_part(s):\n",
        "    parts = s.split('\\n', 1)\n",
        "    if len(parts) > 1:\n",
        "        return parts[1]\n",
        "    return s\n",
        "\n",
        "def render_inline(image, resize=(128, 128)):\n",
        "    \"\"\"Convert image into inline html.\"\"\"\n",
        "    image = Image.fromarray(image)\n",
        "    image.resize(resize)\n",
        "    with io.BytesIO() as buffer:\n",
        "        image.save(buffer, format='jpeg')\n",
        "        image_b64 = str(base64.b64encode(buffer.getvalue()), \"utf-8\")\n",
        "        return f\"data:image/jpeg;base64,{image_b64}\"\n",
        "\n",
        "def render_example(image, caption):\n",
        "    image = ((image + 1)/2 * 255).astype(np.uint8)  # [-1,1] -> [0, 255]\n",
        "    h, w, _ = image.shape\n",
        "    try:\n",
        "        detections = sv.Detections.from_lmm(\n",
        "            lmm='paligemma',\n",
        "            result=caption,\n",
        "            resolution_wh=(w, h),\n",
        "            classes=CLASSES)\n",
        "        image = sv.BoundingBoxAnnotator().annotate(image, detections)\n",
        "        image = sv.LabelAnnotator().annotate(image, detections)\n",
        "    except:\n",
        "        print(caption)\n",
        "    return f\"\"\"\n",
        "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
        "    <img style=\"width:128px; height:128px;\" src=\"{render_inline(image, resize=(64,64))}\" />\n",
        "    <p style=\"width:256px; margin:10px; font-size:small;\">{html.escape(caption)}</p>\n",
        "</div>\n",
        "\"\"\"\n",
        "\n",
        "html_out = \"\"\n",
        "for idx, example in zip(range(4), train_data_iterator()):\n",
        "    caption = postprocess_tokens(example[\"text\"])  # detokenize model input.\n",
        "    caption = split_and_keep_second_part(caption)\n",
        "    html_out += render_example(example[\"image\"], caption)\n",
        "\n",
        "print(\"Training examples\")\n",
        "display(HTML(html_out))"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "### Define the training and evaluation loops\n",
        "\n",
        "Define the training loop to train the model on the provided dataset, and the evaluation loop to look at all of the examples in the validation dataset and make its predictions.\n",
        "\n",
        "#### Defining the training loop\n",
        "\n",
        "The `update_fn` function defines the training step. During the training step, the loss per example is calculated and stochastic gradient descent (SGD) is applied to the trainable parameters.\n",
        "\n",
        "Recall that earlier in the notebook, you included flags in the `preprocess_tokens` function that included `mask_loss`. You'll use the `mask_loss` flag here to exclude prefix and padded tokens from the loss. Without it, the loss calculation will be skewed. You also need to normalize each example, since each of them has a different number of tokens. After the prefix and padded tokens have been excluded and the examples have been normalized, you can calculate the loss per example.\n",
        "\n",
        "The training step also includes a function to apply an SGD to optimize the training.\n",
        "\n",
        "#### Defining the evaluation loop\n",
        "\n",
        "The `make_predictions` function is your evaluation loop. The evaluation loop is fairly straight forward with one notable change. If you recall from the beginning of the notebook, you only have 90 examples in your training data set. This is a very small amount of training examples, and your model ends up not having enough examples for the batch size when you run the training. This means that in the evaluation loop, you need to pad the batch by repeating examples.\n",
        "\n",
        "To make sure that your evaluation loop only counts actual examples and not the padded examples, you have to apply a mask to the padded examples that excludes them from the output."
      ],
      "metadata": {
        "id": "hKFJ9rbLKoTa"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "dwUV_imW3WQJ"
      },
      "outputs": [],
      "source": [
        "# The main update_fn using a simple stochastic gradient descent (SGD).\n",
        "@functools.partial(jax.jit, donate_argnums=(0,))\n",
        "def update_fn(params, batch, learning_rate):\n",
        "  imgs, txts, mask_ar = batch[\"image\"], batch[\"text\"], batch[\"mask_ar\"]\n",
        "\n",
        "  def loss_fn(params):\n",
        "    text_logits, _ = model.apply({\"params\": params}, imgs, txts[:, :-1], mask_ar[:, :-1], train=True)\n",
        "    logp = jax.nn.log_softmax(text_logits, axis=-1)\n",
        "\n",
        "    # The model takes as input txts[:, :-1] but the loss is defined as predicting\n",
        "    # next tokens txts[:, 1:]. Additionally, mask_loss[:, 1:] indicates which tokens\n",
        "    # are part of the loss (e.g. prefix and padded tokens are not included).\n",
        "    mask_loss = batch[\"mask_loss\"][:, 1:]\n",
        "    targets = jax.nn.one_hot(txts[:, 1:], text_logits.shape[-1])\n",
        "\n",
        "    # Compute the loss per example. i.e. the mean of per token pplx.\n",
        "    # Since each example has a different number of tokens we normalize it.\n",
        "    token_pplx = jnp.sum(logp * targets, axis=-1)  # sum across vocab_size.\n",
        "    example_loss = -jnp.sum(token_pplx * mask_loss, axis=-1)  # sum across seq_len.\n",
        "    example_loss /= jnp.clip(jnp.sum(mask_loss, -1), 1)  # weight by num of tokens.\n",
        "\n",
        "    # batch_loss: mean of per example loss.\n",
        "    return jnp.mean(example_loss)\n",
        "\n",
        "  loss, grads = jax.value_and_grad(loss_fn)(params)\n",
        "\n",
        "  # Apply gradients to trainable params using SGD.\n",
        "  def apply_grad(param, gradient, trainable):\n",
        "    if not trainable: return param\n",
        "    return param - learning_rate * gradient\n",
        "\n",
        "  params = jax.tree_util.tree_map(apply_grad, params, grads, trainable_mask)\n",
        "\n",
        "  return params, loss\n",
        "\n",
        "# Evaluation/inference loop.\n",
        "def make_predictions(data_iterator, *, num_examples=None,\n",
        "                     batch_size=4, seqlen=SEQLEN, sampler=\"greedy\"):\n",
        "  outputs = []\n",
        "  while True:\n",
        "    # Construct a list of examples in the batch.\n",
        "    examples = []\n",
        "    try:\n",
        "      for _ in range(batch_size):\n",
        "        examples.append(next(data_iterator))\n",
        "        examples[-1][\"_mask\"] = np.array(True)  # Indicates true example.\n",
        "    except StopIteration:\n",
        "      if len(examples) == 0:\n",
        "        return outputs\n",
        "\n",
        "    # Not enough examples to complete a batch. Pad by repeating last example.\n",
        "    while len(examples) % batch_size:\n",
        "      examples.append(dict(examples[-1]))\n",
        "      examples[-1][\"_mask\"] = np.array(False)  # Indicates padding example.\n",
        "\n",
        "    # Convert list of examples into a dict of np.arrays and load onto devices.\n",
        "    batch = jax.tree.map(lambda *x: np.stack(x), *examples)\n",
        "    batch = big_vision.utils.reshard(batch, data_sharding)\n",
        "\n",
        "    # Make model predictions\n",
        "    tokens = decode({\"params\": params}, batch=batch,\n",
        "                    max_decode_len=seqlen, sampler=sampler)\n",
        "\n",
        "    # Fetch model predictions to device and detokenize.\n",
        "    tokens, mask = jax.device_get((tokens, batch[\"_mask\"]))\n",
        "    tokens = tokens[mask]  # remove padding examples.\n",
        "    labels = [postprocess_tokens(e[\"label\"]) for e in examples]\n",
        "    responses = [postprocess_tokens(t) for t in tokens]\n",
        "\n",
        "    # Append to html output.\n",
        "    for example, label, response in zip(examples, labels, responses):\n",
        "      outputs.append((example[\"image\"], label, response))\n",
        "      if num_examples and len(outputs) >= num_examples:\n",
        "        return outputs"
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "html_out = \"\"\n",
        "for image, _, caption in make_predictions(validation_data_iterator(), num_examples=4, batch_size=4):\n",
        "  html_out += render_example(image, caption)\n",
        "display(HTML(html_out))"
      ],
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 375
        },
        "id": "GCXYnIdm4ILQ",
        "outputId": "51ed1a0f-6580-4728-dfea-606310bc6056"
      },
      "execution_count": null,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0921&gt; 1 ; &lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0921&gt; 3 ; &lt;loc0046&gt;&lt;loc0505&gt;&lt;loc0896&gt;&lt;loc0696&gt; 7 ; &lt;loc0046&gt;&lt;loc0603&gt;&lt;loc0432&gt;&lt;loc0687&gt; 5 ; &lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0536&gt;&lt;loc0912&gt; 6 ; &lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0921&gt; 4 ; &lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0533&gt;&lt;loc0912&gt; 9 ; &lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0912&gt; 2 ; &lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0531&gt;&lt;loc0912&gt; 8 ; &lt;loc0045&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0912&gt; 1 ; &lt;loc0045&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1013&gt; mult ; &lt;loc0045&gt;&lt;loc0541&gt;&lt;loc0896&gt;&lt;loc0696&gt; 7 ; &lt;loc0045&gt;&lt;loc0567&gt;&lt;loc0456&gt;&lt;loc0690&gt; minus ; &lt;loc0413&gt;&lt;loc0233&gt;&lt;loc0538&gt;&lt;loc0912&gt; div ; &lt;loc0036&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1015&gt; 1 ; &lt;loc0036&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 1 ; &lt;loc0036&gt;&lt;loc0000&gt;&lt;loc1023&gt;</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0217&gt;&lt;loc0261&gt;&lt;loc0957&gt;&lt;loc0819&gt; 4 ; &lt;loc0207&gt;&lt;loc0252&gt;&lt;loc0954&gt;&lt;loc0826&gt; 7 ; &lt;loc0207&gt;&lt;loc0246&gt;&lt;loc0949&gt;&lt;loc0826&gt; 5 ; &lt;loc0207&gt;&lt;loc0246&gt;&lt;loc0949&gt;&lt;loc0826&gt; 1 ; &lt;loc0207&gt;&lt;loc0421&gt;&lt;loc0778&gt;&lt;loc0703&gt; minus ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1017&gt; mult ; &lt;loc0207&gt;&lt;loc0246&gt;&lt;loc0949&gt;&lt;loc0818&gt; 3 ; &lt;loc0207&gt;&lt;loc0246&gt;&lt;loc0944&gt;&lt;loc0809&gt; 6 ; &lt;loc0754&gt;&lt;loc0264&gt;&lt;loc0936&gt;&lt;loc0513&gt; 8 ; &lt;loc0754&gt;&lt;loc0261&gt;&lt;loc0944&gt;&lt;loc0813&gt; 4 ; &lt;loc0754&gt;&lt;loc0261&gt;&lt;loc0936&gt;&lt;loc0513&gt; 5 ; &lt;loc0754&gt;&lt;loc0261&gt;&lt;loc0944&gt;&lt;loc0813&gt; 7 ; &lt;loc0754&gt;&lt;loc0261&gt;&lt;loc0936&gt;&lt;loc0813&gt; 9 ; &lt;loc0747&gt;&lt;loc0496&gt;&lt;loc0956&gt;&lt;loc0817&gt; div ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1017&gt; 1 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 2 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0920&gt;&lt;loc0634&gt; 1 ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0531&gt;&lt;loc0549&gt; 7 ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0920&gt;&lt;loc0629&gt; 3 ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0531&gt;&lt;loc0438&gt; 6 ; &lt;loc0116&gt;&lt;loc0222&gt;&lt;loc0925&gt;&lt;loc0629&gt; 4 ; &lt;loc0116&gt;&lt;loc0222&gt;&lt;loc0925&gt;&lt;loc0629&gt; 5 ; &lt;loc0116&gt;&lt;loc0222&gt;&lt;loc0541&gt;&lt;loc0553&gt; 8 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1018&gt;&lt;loc1018&gt; mult ; &lt;loc0116&gt;&lt;loc0222&gt;&lt;loc0925&gt;&lt;loc0629&gt; 2 ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0541&gt;&lt;loc0557&gt; minus ; &lt;loc0866&gt;&lt;loc0404&gt;&lt;loc0926&gt;&lt;loc0658&gt; plus ; &lt;loc0168&gt;&lt;loc0505&gt;&lt;loc0906&gt;&lt;loc0580&gt; 4 ; &lt;loc0168&gt;&lt;loc0505&gt;&lt;loc0902&gt;&lt;loc0575&gt; 9 ; &lt;loc0866&gt;&lt;loc0403&gt;&lt;loc0922&gt;&lt;loc0658&gt; div ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0925&gt;&lt;loc0635&gt; 1 ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0925&gt;&lt;loc0626&gt; 4 ; &lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0541&gt;&lt;loc0549&gt;</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0143&gt;&lt;loc0374&gt;&lt;loc0658&gt;&lt;loc0442&gt; 4 ; &lt;loc0143&gt;&lt;loc0374&gt;&lt;loc0658&gt;&lt;loc0441&gt; 5 ; &lt;loc0140&gt;&lt;loc0366&gt;&lt;loc0667&gt;&lt;loc0442&gt; 6 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1017&gt; mult ; &lt;loc0140&gt;&lt;loc0366&gt;&lt;loc0667&gt;&lt;loc0442&gt; 7 ; &lt;loc0134&gt;&lt;loc0365&gt;&lt;loc0667&gt;&lt;loc0442&gt; 8 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1017&gt; 3 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 1 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; eq ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 2 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc0693&gt;&lt;loc1023&gt; 4 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 9 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; minus ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 10 ; &lt;loc0000&gt;&lt;loc0356&gt;&lt;loc0703&gt;&lt;loc0456&gt; 1 ; &lt;loc0000&gt;&lt;loc0000&gt;&lt;loc1023&gt;&lt;loc1023&gt; 3 ; &lt;loc0000&gt;&lt;loc0000&gt;</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        }
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "## Tune the model\n",
        "\n",
        "Now that you've set everything up and taken a look at the training data, it's time to finally tune the model. The code below runs the training loop for the model for 64 steps and prints the learning rate (`lr` in the printed output) and loss rate for each step.\n",
        "\n",
        "Every 16 steps, the model prints what its predictions are at that step in the training. This code prints out predictions for the same set of images so that you can see the model's ability to predict descriptions improve over time.\n",
        "\n",
        "At earlier steps in the training, there's likely issues with the descriptions, such as repeated sentences as the model gets stuck in its predictive loop or unfinished sentences. The model's predictions become steadily more accurate as training progresses. By step 64, the model's predictions should closely resemble the descriptions provided by the training data.\n",
        "\n",
        "This process takes around 15 minutes to complete on T4 TPUs."
      ],
      "metadata": {
        "id": "fNigSP99MJFe"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "067wj_6bZAG3",
        "outputId": "8671008b-9a90-4a36-f297-0facd27185b4",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step:  1/64   lr: 0.00083   loss: 5.6622\n",
            "step:  2/64   lr: 0.00167   loss: 3.8568\n",
            "step:  3/64   lr: 0.00250   loss: 2.8670\n",
            "step:  4/64   lr: 0.00333   loss: 2.9897\n",
            "step:  5/64   lr: 0.00417   loss: 4.1907\n",
            "step:  6/64   lr: 0.00500   loss: 3.3373\n",
            "step:  7/64   lr: 0.00500   loss: 2.8551\n",
            "step:  8/64   lr: 0.00499   loss: 2.9553\n",
            "Model predictions at step 8\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0039&gt;&lt;loc0238&gt;&lt;loc0912&gt;&lt;loc0921&gt; mult</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0264&gt;&lt;loc0968&gt;&lt;loc0826&gt; mult</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0936&gt;&lt;loc0646&gt; mult</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0137&gt;&lt;loc0377&gt;&lt;loc0694&gt;&lt;loc0452&gt; 9</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step:  9/64   lr: 0.00497   loss: 2.9513\n",
            "step: 10/64   lr: 0.00494   loss: 3.1961\n",
            "step: 11/64   lr: 0.00491   loss: 3.0491\n",
            "step: 12/64   lr: 0.00487   loss: 2.8612\n",
            "step: 13/64   lr: 0.00483   loss: 2.5327\n",
            "step: 14/64   lr: 0.00478   loss: 2.4965\n",
            "step: 15/64   lr: 0.00472   loss: 2.9966\n",
            "step: 16/64   lr: 0.00465   loss: 2.9704\n",
            "Model predictions at step 16\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0216&gt;&lt;loc0887&gt;&lt;loc0887&gt; div</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0236&gt;&lt;loc0235&gt;&lt;loc0949&gt;&lt;loc0789&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0123&gt;&lt;loc0216&gt;&lt;loc0912&gt;&lt;loc0617&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0144&gt;&lt;loc0363&gt;&lt;loc0653&gt;&lt;loc0432&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step: 17/64   lr: 0.00458   loss: 2.8967\n",
            "step: 18/64   lr: 0.00451   loss: 2.5452\n",
            "step: 19/64   lr: 0.00442   loss: 2.5070\n",
            "step: 20/64   lr: 0.00434   loss: 2.5249\n",
            "step: 21/64   lr: 0.00424   loss: 2.3752\n",
            "step: 22/64   lr: 0.00415   loss: 2.6018\n",
            "step: 23/64   lr: 0.00404   loss: 2.5676\n",
            "step: 24/64   lr: 0.00394   loss: 2.3139\n",
            "Model predictions at step 24\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0223&gt;&lt;loc0906&gt;&lt;loc0919&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0246&gt;&lt;loc0961&gt;&lt;loc0826&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0216&gt;&lt;loc0928&gt;&lt;loc0647&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0140&gt;&lt;loc0366&gt;&lt;loc0667&gt;&lt;loc0452&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step: 25/64   lr: 0.00383   loss: 2.2908\n",
            "step: 26/64   lr: 0.00371   loss: 2.1218\n",
            "step: 27/64   lr: 0.00359   loss: 2.4826\n",
            "step: 28/64   lr: 0.00347   loss: 2.5923\n",
            "step: 29/64   lr: 0.00335   loss: 2.7243\n",
            "step: 30/64   lr: 0.00322   loss: 2.4624\n",
            "step: 31/64   lr: 0.00309   loss: 2.4676\n",
            "step: 32/64   lr: 0.00296   loss: 2.2581\n",
            "Model predictions at step 32\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0036&gt;&lt;loc0230&gt;&lt;loc0888&gt;&lt;loc0912&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0211&gt;&lt;loc0251&gt;&lt;loc0961&gt;&lt;loc0826&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0112&gt;&lt;loc0222&gt;&lt;loc0919&gt;&lt;loc0662&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0132&gt;&lt;loc0366&gt;&lt;loc0662&gt;&lt;loc0447&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step: 33/64   lr: 0.00283   loss: 2.2747\n",
            "step: 34/64   lr: 0.00270   loss: 2.4034\n",
            "step: 35/64   lr: 0.00257   loss: 2.1800\n",
            "step: 36/64   lr: 0.00243   loss: 2.3051\n",
            "step: 37/64   lr: 0.00230   loss: 2.2851\n",
            "step: 38/64   lr: 0.00217   loss: 2.1452\n",
            "step: 39/64   lr: 0.00204   loss: 2.3099\n",
            "step: 40/64   lr: 0.00191   loss: 2.1190\n",
            "Model predictions at step 40\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0238&gt;&lt;loc0896&gt;&lt;loc0912&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0261&gt;&lt;loc0965&gt;&lt;loc0814&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0919&gt;&lt;loc0653&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0140&gt;&lt;loc0369&gt;&lt;loc0662&gt;&lt;loc0447&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step: 41/64   lr: 0.00178   loss: 2.3048\n",
            "step: 42/64   lr: 0.00165   loss: 2.2890\n",
            "step: 43/64   lr: 0.00153   loss: 2.4525\n",
            "step: 44/64   lr: 0.00141   loss: 2.3838\n",
            "step: 45/64   lr: 0.00129   loss: 2.2427\n",
            "step: 46/64   lr: 0.00117   loss: 2.0810\n",
            "step: 47/64   lr: 0.00106   loss: 2.2332\n",
            "step: 48/64   lr: 0.00096   loss: 2.1453\n",
            "Model predictions at step 48\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0238&gt;&lt;loc0896&gt;&lt;loc0919&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0261&gt;&lt;loc0961&gt;&lt;loc0824&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0230&gt;&lt;loc0919&gt;&lt;loc0653&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0141&gt;&lt;loc0374&gt;&lt;loc0662&gt;&lt;loc0450&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step: 49/64   lr: 0.00085   loss: 2.1180\n",
            "step: 50/64   lr: 0.00076   loss: 2.5212\n",
            "step: 51/64   lr: 0.00066   loss: 2.3260\n",
            "step: 52/64   lr: 0.00058   loss: 2.3060\n",
            "step: 53/64   lr: 0.00049   loss: 2.3717\n",
            "step: 54/64   lr: 0.00042   loss: 2.3142\n",
            "step: 55/64   lr: 0.00035   loss: 1.9866\n",
            "step: 56/64   lr: 0.00028   loss: 2.1863\n",
            "Model predictions at step 56\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0919&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0261&gt;&lt;loc0957&gt;&lt;loc0824&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0121&gt;&lt;loc0222&gt;&lt;loc0919&gt;&lt;loc0653&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0141&gt;&lt;loc0374&gt;&lt;loc0655&gt;&lt;loc0450&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "step: 57/64   lr: 0.00022   loss: 2.1201\n",
            "step: 58/64   lr: 0.00017   loss: 2.2197\n",
            "step: 59/64   lr: 0.00013   loss: 2.1977\n",
            "step: 60/64   lr: 0.00009   loss: 2.2539\n",
            "step: 61/64   lr: 0.00006   loss: 2.0506\n",
            "step: 62/64   lr: 0.00003   loss: 2.4619\n",
            "step: 63/64   lr: 0.00001   loss: 2.2024\n",
            "step: 64/64   lr: 0.00000   loss: 2.3148\n",
            "Model predictions at step 64\n"
          ]
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0912&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0261&gt;&lt;loc0957&gt;&lt;loc0824&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0222&gt;&lt;loc0919&gt;&lt;loc0653&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0141&gt;&lt;loc0374&gt;&lt;loc0658&gt;&lt;loc0450&gt; 1</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        },
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "CPU times: user 11min 12s, sys: 463 ms, total: 11min 12s\n",
            "Wall time: 11min 14s\n"
          ]
        }
      ],
      "source": [
        "# Run a short training loop with cosine learning rate schedule.\n",
        "#\n",
        "# Note: the first step can be quite slow on some machines (up to several minutes)\n",
        "# due to XLA compilation of the jax.jit'd function.\n",
        "#\n",
        "%%time\n",
        "\n",
        "BATCH_SIZE = 8\n",
        "TRAIN_EXAMPLES = 512\n",
        "LEARNING_RATE = 0.005\n",
        "\n",
        "TRAIN_STEPS = TRAIN_EXAMPLES // BATCH_SIZE\n",
        "EVAL_STEPS = TRAIN_STEPS // 8\n",
        "\n",
        "train_data_it = train_data_iterator()\n",
        "\n",
        "sched_fn = big_vision.utils.create_learning_rate_schedule(\n",
        "    total_steps=TRAIN_STEPS+1, base=LEARNING_RATE,\n",
        "    decay_type=\"cosine\", warmup_percent=0.10)\n",
        "\n",
        "for step in range(1, TRAIN_STEPS+1):\n",
        "  # Make list of N training examples.\n",
        "  examples = [next(train_data_it) for _ in range(BATCH_SIZE)]\n",
        "\n",
        "  # Convert list of examples into a dict of np.arrays and load onto devices.\n",
        "  batch = jax.tree.map(lambda *x: np.stack(x), *examples)\n",
        "  batch = big_vision.utils.reshard(batch, data_sharding)\n",
        "\n",
        "  # Training step and report training loss\n",
        "  learning_rate = sched_fn(step)\n",
        "  params, loss = update_fn(params, batch, learning_rate)\n",
        "\n",
        "  loss = jax.device_get(loss)\n",
        "  print(f\"step: {step:2d}/{TRAIN_STEPS:2d}   lr: {learning_rate:.5f}   loss: {loss:.4f}\")\n",
        "\n",
        "  if (step % EVAL_STEPS) == 0:\n",
        "    print(f\"Model predictions at step {step}\")\n",
        "    html_out = \"\"\n",
        "    for image, _, caption in make_predictions(\n",
        "        validation_data_iterator(), num_examples=4, batch_size=4):\n",
        "      html_out += render_example(image, caption)\n",
        "    display(HTML(html_out))\n"
      ]
    },
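    {
      "cell_type": "markdown",
      "source": [
        "Each prediction rendered above encodes one bounding box as four `<locXXXX>` tokens followed by the class name. The four values are `y_min, x_min, y_max, x_max`, quantized into 1024 bins over the image. As a rough sketch (the helper below is illustrative, not part of the training code), the tokens can be decoded like this:\n",
        "\n",
        "```python\n",
        "def decode_box(locs, width, height):\n",
        "    # locs: [y_min, x_min, y_max, x_max], each an integer in [0, 1023]\n",
        "    y_min, x_min, y_max, x_max = locs\n",
        "    return (x_min / 1024 * width, y_min / 1024 * height,\n",
        "            x_max / 1024 * width, y_max / 1024 * height)\n",
        "```\n",
        "\n",
        "In this notebook the parsing is handled by `sv.Detections.from_lmm`, so the sketch is only an aid for reading the raw model outputs."
      ],
      "metadata": {}
    },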
    {
      "cell_type": "markdown",
      "source": [
        "## Evaluate fine-tuned model"
      ],
      "metadata": {
        "id": "rGjB_4Mo8RA_"
      }
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 548
        },
        "id": "hgUhEKjzPdMQ",
        "outputId": "59b2fe9a-5ba4-48e1-a08d-fcc7c91049b9"
      },
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ],
            "text/html": [
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0046&gt;&lt;loc0233&gt;&lt;loc0896&gt;&lt;loc0912&gt; plus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0219&gt;&lt;loc0261&gt;&lt;loc0957&gt;&lt;loc0824&gt; 2</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0116&gt;&lt;loc0222&gt;&lt;loc0919&gt;&lt;loc0653&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0141&gt;&lt;loc0374&gt;&lt;loc0658&gt;&lt;loc0450&gt; 1</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0112&gt;&lt;loc0155&gt;&lt;loc0927&gt;&lt;loc0639&gt; 3</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0127&gt;&lt;loc0297&gt;&lt;loc0785&gt;&lt;loc0673&gt; 4</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0058&gt;&lt;loc0261&gt;&lt;loc0831&gt;&lt;loc0662&gt; 8</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0222&gt;&lt;loc0453&gt;&lt;loc0735&gt;&lt;loc0819&gt; 7</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0449&gt;&lt;loc0180&gt;&lt;loc0533&gt;&lt;loc0797&gt; minus</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0266&gt;&lt;loc0502&gt;&lt;loc0936&gt;&lt;loc0626&gt; 1</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0172&gt;&lt;loc0222&gt;&lt;loc0912&gt;&lt;loc0634&gt; 5</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0230&gt;&lt;loc0363&gt;&lt;loc0894&gt;&lt;loc0738&gt; 7</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0266&gt;&lt;loc0572&gt;&lt;loc0827&gt;&lt;loc0766&gt; 1</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0127&gt;&lt;loc0230&gt;&lt;loc0901&gt;&lt;loc0703&gt; 6</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0317&gt;&lt;loc0140&gt;&lt;loc0936&gt;&lt;loc0812&gt; mult</p>\n",
              "</div>\n",
              "\n",
              "<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n",
              "    <img style=\"width:128px; height:128px;\" src=\"\" />\n",
              "    <p style=\"width:256px; margin:10px; font-size:small;\">&lt;loc0161&gt;&lt;loc0261&gt;&lt;loc0838&gt;&lt;loc0831&gt; 6</p>\n",
              "</div>\n"
            ]
          },
          "metadata": {}
        }
      ],
      "source": [
        "# @title Visualize results\n",
        "html_out = \"\"\n",
        "for image, _, caption in make_predictions(validation_data_iterator(), num_examples=16, batch_size=8):\n",
        "  html_out += render_example(image, caption)\n",
        "display(HTML(html_out))"
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "# @title Collect predictions\n",
        "targets = []\n",
        "predictions = []\n",
        "\n",
        "for image, label, prediction in make_predictions(validation_data_iterator(), num_examples=512, batch_size=8):\n",
        "    h, w, _ = image.shape\n",
        "    target = sv.Detections.from_lmm(\n",
        "        lmm='paligemma',\n",
        "        result=label,\n",
        "        resolution_wh=(w, h),\n",
        "        classes=CLASSES)\n",
        "    targets.append(target)\n",
        "    prediction = sv.Detections.from_lmm(\n",
        "        lmm='paligemma',\n",
        "        result=prediction,\n",
        "        resolution_wh=(w, h),\n",
        "        classes=CLASSES)\n",
        "    prediction.confidence = np.ones(len(prediction))\n",
        "    predictions.append(prediction)"
      ],
      "metadata": {
        "id": "JD9l94a8pYRc"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "# @title Calculate mAP\n",
        "mean_average_precision = sv.MeanAveragePrecision.from_detections(\n",
        "    predictions=predictions,\n",
        "    targets=targets,\n",
        ")\n",
        "\n",
        "print(f\"map50_95: {mean_average_precision.map50_95:.2f}\")\n",
        "print(f\"map50: {mean_average_precision.map50:.2f}\")\n",
        "print(f\"map75: {mean_average_precision.map75:.2f}\")"
      ],
      "metadata": {
        "id": "d62AueMC7Yp3",
        "outputId": "487bb72c-c023-47c9-a33b-acdbf6ac1be0",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": null,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "map50_95: 0.83\n",
            "map50: 0.94\n",
            "map75: 0.90\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "# @title Calculate Confusion Matrix\n",
        "confusion_matrix = sv.ConfusionMatrix.from_detections(\n",
        "    predictions=predictions,\n",
        "    targets=targets,\n",
        "    classes=CLASSES\n",
        ")\n",
        "\n",
        "_ = confusion_matrix.plot()"
      ],
      "metadata": {
        "id": "lE7je-hL8Bj3",
        "outputId": "86db3d53-0caa-4186-ae8f-0e0460081c61",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "execution_count": null,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<Figure size 1200x1000 with 2 Axes>"
            ],
            "image/png": "iVBORw0KGgoAAAANSUhEUgAABEoAAAPcCAYAAABW3l4wAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAACKl0lEQVR4nOzdeZiVdd0/8PcZlgFZBlABF1BUXHNLUzFzV6S0fDRXXDLTMrWUtF9kmmiG9VTa4pa5Zalp5ZYLqbnmvuBauAvGYi4wDMiAzPz+4G6eJtFEhzlnzrxec93XxX2fe+a8BzPhPZ/v9y41Nzc3BwAAAIDUlDsAAAAAQKVQlAAAAAAUFCUAAAAABUUJAAAAQEFRAgAAAFBQlAAAAAAUFCUAAAAABUUJAAAAQKFruQMAAAAAH828efMyf/78csdYYt27d0+PHj3KHaMVRQkAAAB0YPPmzUvPPssm78wtd5QlNnjw4Lz00ksVVZYoSgAAAKADmz9/fvLO3NSue3DSpXu543xwC+dn+jOXZP78+YoSAAAAoI116Z5SBypKmssd4D0oSgAAAKAalGoWHR1FhWatzFQAAAAAZaAoAQAAACgoSgAAAAAK9igBAACAalBKUiqVO8UHV6FRTZQAAAAAFBQlAAAAAAVLbwAAAKAaeDxwm6jMVAAAAABloCgBAAAAKChKAAAAAAr2KAEAAIBqUCp1sMcDV2ZWEyUAAAAABUUJAAAAQMHSm4+oqakpU6dOTZ8+fVKq0LEhAACAzq65uTmzZ8/OiiuumJoaMwO8N0XJRzR16tQMGTKk3DEAAAD4AKZMmZKVV1653DGWjlLNoqOjqNCsipKPqE+fPkmS7h//akpdasucpu29duvJ5Y4AAHxADY1N5Y6w1LzT1FzuCEtNv55dyh0BOoX6+voMGTKk5e9w8F4UJR/Rv5bblLrUptS1+oqSvn37ljsCAPAB1ShKOqS+ihJoV7ZM4L9RlAAAAEA18HjgNlGZC4IAAAAAykBRAgAAAFBQlAAAAAAU7FECAAAAVaGDPR64Qmc3KjMVAAAAQBkoSgAAAAAKlt4AAABANfB44DZhogQAAACgoCgBAAAAKChKAAAAAAr2KAEAAIBqUOpgjweu0KyVmQoAAACgDBQlAAAAAAVLbwAAAKAaeDxwmzBRAgAAAFBQlAAAAAAUFCUAAAAABXuUAAAAQDXweOA2UZmpOrkTDt0hb987vtUx8fJjF3vvNT/+Qt6+d3x223rddk4JAHRG99x9V/ba47MZPmzl9OnRJddfd025I7WZn/zv6dl+qy0yZGC/DF9lhYzee4889+ykcscCoJ0pSgpnnXVWVl111fTo0SObb755HnzwwbLmefrF6Vl119Najh2+ct677jl6n0+mubkM4QCATmvu3DlZf/0N8+Mzf17uKG3u3rvvype+fET+fMdf88frb86CBQuyx26jMmfOnHJHA6AdWXqT5He/+13GjBmTc889N5tvvnnOPPPMjBw5MpMmTcrAgQPLkumdd5oy482G93x9g+Er5Ov7fSqf/OIv8vKfTmjHZABAZ7bzyFHZeeSocsdYKn5/3Y2tzs/+5YUZvsoKmfjYI/nkVluXKRUA7c1ESZKf/OQnOeyww3LIIYdk3XXXzbnnnptlllkmF154YdkyrTFkubx47dg8c9Xxuei7+2TIoLqW13rWdsvFJ++TY3587fuWKQAAfHj19bOSJP37DyhzEoAPqFTqeEcF6vRFyfz58/PII49kxx13bLlWU1OTHXfcMffdd9+77m9sbEx9fX2ro6099PSUHP69q/LZMRflaz+6Jquu2D+3nvPl9F6me5Lkh1//TO5/cnL+dPff2vy9AQBImpqaMvb4Mdl8xJZZd72PlTsOAO2o0y+9ef3117Nw4cIMGjSo1fVBgwbl73//+7vuHz9+fMaNG7dUM/35/mdbfv3UC9Pz0NNTMumP/y97br9BXp85J9tusn
q2+EL1rQsGAKgUxx1zdP72zNO56dY7yx0FgHbW6YuSJTV27NiMGTOm5by+vj5DhgxZqu85q2Fenp/yelZfedl8bPXBWW2lAZk+4aRW91x+2uj89fGXM/Ko85dqFgCAanf8sV/LhJtuyI233J6VVl653HEAPjiPB24Tnb4oWW655dKlS5fMmDGj1fUZM2Zk8ODB77q/trY2tbW17RUvSdKrZ/cMW2lApt88O3+47YlcdP1DrV5/5DfH5Js/uyE33GMpDgDAh9Xc3Jxvjvl6brjumlw/4bassuqwckcCoAw6fVHSvXv3bLLJJrntttuy++67J1m0JvW2227LUUcdVZZM448alRvu+XsmT38rKy7XN9/50o5ZuLApV97yeF6fOWexG7hOmTEzr0x7qwxpAYDOpKGhIS++8HzL+Ssvv5wnHp+Y/v0HZMjQoWVM9tEdd8zR+f2Vl+eyK/+Y3r37ZMb06UmSvnV16dmzZ5nTAdBeOn1RkiRjxozJwQcfnE033TSbbbZZzjzzzMyZMyeHHHJIWfKsNLAuvx63bwbULZPXZ87JvU+8nG0OPyevz5xTljwAAP/y2CMP59Mjd2g5H/vNbyRJ9j/goJz3q4vKFatNXHj+uUmSXf/t+0uSs867IPsfeHA5IgFQBoqSJPvss0/++c9/5qSTTsr06dOz0UYb5eabb37XBq/t5aCTrlii+3tuOXYpJQEAaO1T22yb2fMWljvGUvHW3HfKHQHgoymVKnbfj8Wq0McDK0oKRx11VNmW2gAAAACVoQNVTQAAAABLl4kSAAAAqAY1pUVHR1GhWU2UAAAAABQUJQAAAAAFRQkAAABAwR4lAAAAUA1KNR3s8cCVmbUyUwEAAACUgaIEAAAAoGDpDQAAAFSDUmnR0VFUaFYTJQAAAAAFRQkAAABAQVECAAAAULBHCQAAAFQDjwduE5WZCgAAAKAMFCUAAAAABUUJAAAAQMEeJQAAAFANSqVFR0dRoVlNlAAAAAAUFCUAAAAABUtvAAAAoBp4PHCbqMxUAAAAAGVgoqSNvHbryenbt2+5Y7S5YcfeUO4IS81LZ3ym3BGADq6hsancEZaa3rV+ltIR+ecGAB+d/5oCAAAAFEyUAAAAQDXweOA2YaIEAAAAoKAoAQAAAChYegMAAADVwOOB20RlpgIAAAAoA0UJAAAAQEFRAgAAAFCwRwkAAABUA48HbhMmSgAAAAAKihIAAACAgqU3AAAAUBU62OOBK3R2ozJTAQAAAJSBogQAAACgoCgBAAAAKNijBAAAAKqBxwO3CRMlAAAAAAVFCQAAAEBBUQIAAABQsEcJAAAAVINSKSl1oHkIe5TAu31lh9Xz0hmfyYm7r5skqVumW07eY73cNnab/O0Hu+Sek7bPd/9n3fTpodMDOqd77r4re+3x2QwftnL69OiS66+7ptyRAADK4uSTT06pVGp1rL322i2vz5s3L0ceeWSWXXbZ9O7dO3vuuWdmzJixxO+jKKFsNhhSl/1HDM3f/lHfcm1Q39oM7Fub71/3t4z84V05/rLHs83ay+cH+25QxqQA5TN37pysv/6G+fGZPy93FACAsltvvfUybdq0luOee+5pee3YY4/N9ddfn6uuuip33nlnpk6dmj322GOJ36PT/5j+rrvuyv/+7//mkUceybRp03L11Vdn9913L3esqrdM9y4584CNMvbKJ3LUTsNbrj87vSFfvfjRlvPJb8zNj26clJ8csFG61JSysKm5HHEBymbnkaOy88hR5Y4BAHQEpZoOtvRmybN27do1gwcPftf1WbNm5YILLshll12W7bffPkly0UUXZZ111sn999+fLbbY4gO/Rwf6HVw65syZkw033DBnnXVWuaN0Kqd8/mP5y99ey1+ffeO/3tunR7c0zHtHSQIAAFCF6uvrWx2NjY3vee9zzz2XFVdcMauttlpGjx6dyZMnJ0keeeSRLFiwIDvuuGPLvWuvvXaGDh2a++67b4nydPqJklGjRm
XUKD+pa0+7brxC1lupbz53xl//6739e3XL0TuvkSvum9IOyQAAAGhvQ4YMaXX+3e9+NyeffPK77tt8881z8cUXZ6211sq0adMybty4fOpTn8pTTz2V6dOnp3v37unXr1+rzxk0aFCmT5++RHk6fVGypBobG1u1W/X19e9zN/9phX498t3/WS8HnvNA5r/T9L739q7tmgsP+0Sem9GQM29+tp0SAgAA0J6mTJmSvn37tpzX1tYu9r5/H3LYYIMNsvnmm2eVVVbJlVdemZ49e7ZZHkXJEho/fnzGjRtX7hgd1sdWrstyfWpz/Te2arnWtUtNNlttQA7aapWsdfxNaWpOetV2ycVf3iwNjQvz5QsfyTuW3QAAALy/UqliH7m7WEXWvn37tipKPqh+/fplzTXXzPPPP5+ddtop8+fPz8yZM1tNlcyYMWOxe5q8H0XJEho7dmzGjBnTcl5fX/+uMSHe273PvZ6RP7iz1bUf7rdhXnytIefe9kKamhdNklzylc0y/52mHParh/7r5AkAAACdT0NDQ1544YUceOCB2WSTTdKtW7fcdttt2XPPPZMkkyZNyuTJkzNixIgl+rqKkiVUW1v7nmNA/HdzGhfm2ekNra69PX9h3pqzIM9Ob0jv2q759Vc2S8/uXXLsbyamd49u6d1j0X1vNjTGYAnQ2TQ0NOTFF55vOX/l5ZfzxOMT07//gAwZOrSMyQAA2tdxxx2X3XbbLausskqmTp2a7373u+nSpUv222+/1NXV5dBDD82YMWMyYMCA9O3bN0cffXRGjBixRE+8SRQlVJj1Vu6bjVftnyS58zvbtXptq1P+kn+89XY5YgGUzWOPPJxPj9yh5XzsN7+RJNn/gINy3q8uKlcsAKASVfnjgV999dXst99+eeONN7L88stnq622yv3335/ll18+SXLGGWekpqYme+65ZxobGzNy5MicffbZSxyr0xclDQ0Nef75//tJ3UsvvZSJEydmwIABGeonde1iv7Pub/n1Ay+8mWHH3lDGNACV5VPbbJvZ8xaWOwYAQNldccUV7/t6jx49ctZZZ+Wss876SO/T6YuShx9+ONtt93+TC//af+Tggw/OxRdfXKZUAAAAQDl0+qJk2223TXOzjS8AAAAARQkAAABUhw76eOBK04F2eQEAAABYuhQlAAAAAAVLbwAAAKAaVPnjgdtLZaYCAAAAKANFCQAAAEBBUQIAAABQsEcJAAAAVAOPB24TJkoAAAAACooSAAAAgIKiBAAAAKBgjxIAAACoAqVSKaUK3fdjsSo0q4kSAAAAgIKiBAAAAKBg6Q0AAABUAUtv2oaJEgAAAICCogQAAACgoCgBAAAAKNijBAAAAKpBqTg6igrNaqIEAAAAoKAoAQAAAChYesP7eumMz5Q7wlKz4XdvK3eEpebxcTuUOwJ0Cr1r/bwBAKgcHg/cNvwJDwAAAKCgKAEAAAAoKEoAAAAACvYoAQAAgCpgj5K2YaIEAAAAoKAoAQAAACgoSgAAAAAK9igBAACAKmCPkrZhogQAAACgoCgBAAAAKFh6AwAAAFXA0pu2YaIEAAAAoKAoAQAAACgoSgAAAAAK9igBAACAalAqjo6iQrOaKAEAAAAoKEoAAAAACpbeAAAAQBXweOC2YaIEAAAAoKAoAQAAAChYegNtZK9PrJS9N10pK/brmSR54Z9zct4dL+Wvz7+RJFm5f898Y+Qa2Whov3TvUpO/Pv9GTr/x2bw5Z345YwMAAPBvFCXQRl6b1Zif3vpCJr8xN6VSsttGK+Sn+22Qfc59MFNnvp1zD9ooz05vyGEXP5okOXL71fPz/TfIAb96OM3NZQ4PAAB0eKVSOtgeJeUOsHidfunN+PHj84lPfCJ9+vTJwIEDs/vuu2fSpEnljkUHdOezr+ee597I5DffzitvvJ1f3PZi5s5fmA2G9M1GQ/tlxX49c+I1z+T51+bk+dfm5MSrn866K/bNZsP6lzs6AAAAhU5flNx555058s
gjc//99+eWW27JggULsvPOO2fOnDnljkYHVlNKdvnYoPTs3iWPT6lP9y41aW5uzvx3mlruaXynKU3Nzdl4aL/yBQUAAKCVTr/05uabb251fvHFF2fgwIF55JFHsvXWW5cpFR3VGgN75dIvbZruXWsyd/7CHHvFE3nxn3Py1pz5eXtBU47ZaY38/LYXUkry9Z3WSNcuNVm+T225YwMAAFWglA72eOAKXXvT6YuS/zRr1qwkyYABAxb7emNjYxobG1vO6+vr2yUXHcPLb8zN3uc+mN61XbPTegNz6v+sm0MvejQv/nNOjr/yyZyw61rZf/MhaWpuzs1PzcgzU+vTZIMSAACAiqEo+TdNTU055phj8slPfjIf+9jHFnvP+PHjM27cuHZORkfxzsLmTHnz7STJ36bNznor9s3oLYbk1Ov/nvteeDO7/vS+9FumWxY2NWf2vHdy23Fb5dW33i5zagAAAP6l0+9R8u+OPPLIPPXUU7niiive856xY8dm1qxZLceUKVPaMSEdTU0p6dal9TjZzLkLMnveO9lsWP8M6NU9d/z99TKlAwAA4D+ZKCkcddRR+dOf/pS77rorK6+88nveV1tbm9pae0rwbl/bcfXc89wbmT5rXpbp3iWf3mBwNl21f464dGKS5HMbrZAXX5+Tt+YsyIZD6vLNUWvmN/dPzitvzC1vcAAAoCqUSh1sj5IKzdrpi5Lm5uYcffTRufrqq3PHHXdk2LBh5Y5EBzWgV/d873/WzfJ9atMw7508O6MhR1w6Mfe/+GaSZNXllsnXdlw9dT27ZerMefnVXS/l0vtMJAEAAFSSTl+UHHnkkbnsssty7bXXpk+fPpk+fXqSpK6uLj179ixzOjqSk6/92/u+/tNbX8hPb32hndIAAADwYXT6PUrOOeeczJo1K9tuu21WWGGFluN3v/tduaMBAAAA7azTT5Q0ezQrAAAA1aBUHB1FhWbt9BMlAAAAAP+iKAEAAAAodPqlNwAAAFAVOtjjgZsrNKuJEgAAAICCogQAAACgoCgBAAAAKNijBAAAAKpAqYPtUVKpWU2UAAAAABQUJQAAAAAFS28AAACgClh60zZMlAAAAAAUFCUAAAAABUUJAAAAQMEeJQAAAFANSsXRUVRoVhMlAAAAAAVFCQAAAEDB0hsAAACoAh4P3DZMlAAAAAAUFCUAAAAABUUJAAAAQMEeJQAAAFAF7FHSNhQldFqPj9uh3BGWmp6fGFPuCEvN2w/9pNwRAACAKmbpDQAAAEBBUQIAAABQsPQGAAAAqoA9StqGiRIAAACAgqIEAAAAoGDpDQAAAFQBS2/ahokSAAAAgIKiBAAAAKCgKAEAAAAo2KMEAAAAqkGpODqKCs1qogQAAACgoCgBAAAAKFh6AwAAAFXA44HbhokSAAAAgIKiBAAAAKCgKAEAAAAo2KMEAAAAqoA9StqGiRIAAACAgqIEAAAAoGDpDQAAAFQBS2/ahqIE+K9OOGxkvnP4yFbXJr08Ixvt9YMkyYRzv5qtN1mj1evn/+HefO3037dbRgAAgLagKAE+kKdfmJbPHHluy/k77zS1ev2Cq+/Lqefd3HI+d978dssGAADQVjr9HiXnnHNONthgg/Tt2zd9+/bNiBEjctNNN5U7FlScdxY2ZcYbs1uON2bNafX62/MWtHp99pzGMiUFAAD48Dr9RMnKK6+c008/PcOHD09zc3MuueSSfO5zn8tjjz2W9dZbr9zxoGKsMWS5vHjjdzNv/jt54MmXc9IvbsiUGTNbXt9nl49n31Efz4w3ZufGu5/J+F/9OW83LihfYAAA6GxKxdFRVGjWTl+U7Lbbbq3OTzvttJxzzjm5//77FSVQeOjpV3L4uCvy7CuvZfByfXPCYTvn1vOPyib7/m8a5jbmdxMezeRpb2XaP+uz/vAV8r2jds2aqyyffb95cbmjAwAALJFOX5T8u4ULF+aqq67KnDlzMmLEiMXe09jYmMbG/1tSUF9f317xoGz+fO
/fW3791PPT8tBTr2TS9Sdmzx03yiXXPZALr76/5fWnX5iWaa/X5+ZzvpphKy2bl/7xRjkiAwAAfCidfo+SJHnyySfTu3fv1NbW5itf+UquvvrqrLvuuou9d/z48amrq2s5hgwZ0s5pofxmNczL85P/mdWHLLfY1x96anKSvOfrAAAAlUpRkmSttdbKxIkT88ADD+SII47IwQcfnGeeeWax944dOzazZs1qOaZMmdLOaaH8evXsnmErLZfpry9+omrDNVdMkvd8HQAAaHulUqnDHZXI0psk3bt3zxprrJEk2WSTTfLQQw/lpz/9ac4777x33VtbW5va2tr2jghlNf7ru+WGu5/J5GlvZsXl6/Kdw0dmYVNTrpzwaIattGz22eXjmfDXv+WNWXOy/vAV88NjP5e7H30hTz0/rdzRAQAAloiiZDGamppa7UMCnd1KA/vl1987IAPqeuX1txpy7+MvZZtDfprXZ85Jj9pu2X6zNXPUvlunV8/ueXXGzFzzlydy+oW3lDs2AADAEuv0RcnYsWMzatSoDB06NLNnz85ll12WO+64IxMmTCh3NKgYB51w6Xu+9uqMmdn5y2e1YxoAAGBxKnk5y+JUatZOX5S89tprOeiggzJt2rTU1dVlgw02yIQJE7LTTjuVOxoAAADQzjp9UXLBBReUOwIAAABQITz1BgAAAKDQ6SdKAAAAoBqU0sH2KEllZjVRAgAAAFBQlAAAAAAULL0BAACAKuDxwG3DRAkAAABAQVECAAAAUFCUAAAAABTsUQIAAADVoFQcHUWFZjVRAgAAAFBQlAAAAAAULL0BAACAKuDxwG3DRAkAAABAQVECAAAAUFCUAAAAABTsUQIAAABVwB4lbcNECQAAAEBBUQIAAABQUJQAAAAAFOxRAgAAAFWgVFp0dBSVmtVECQAAAEDBREkbmfn2wjR1W1juGG2uX88u5Y7Ah/D2Qz8pd4SlZsWv/LHcEZaaqefuUe4IABWrobGp3BGWmt61fnYJUEkUJQAAAFAFFi29qdD1LItRqVHV1wAAAAAFRQkAAABAQVECAAAAULBHCQAAAFSDDvZ44FRoVhMlAAAAAAVFCQAAAEDB0hsAAACoAqVSqYM9Hrgys5ooAQAAACgoSgAAAAAKihIAAACAgj1KAAAAoAqUOtjjgSs1q4kSAAAAgIKiBAAAAKBg6Q0AAABUgZqaUmpqKnQ9y2I0V2hWEyUAAAAABUUJAAAAQEFRAgAAAFBQlAAAAEAV+NfjgTvS8WGdfvrpKZVKOeaYY1quzZs3L0ceeWSWXXbZ9O7dO3vuuWdmzJixxF9bUQIAAAB0GA899FDOO++8bLDBBq2uH3vssbn++utz1VVX5c4778zUqVOzxx57LPHXV5R0AD/539Oz/VZbZMjAfhm+ygoZvfceee7ZSeWOBVXnqJFrZuq5e2TcXhss9vXfHLVlpp67R3bZcIV2TgZAe7jn7ruy1x6fzfBhK6dPjy65/rpryh0JgP/Q0NCQ0aNH5/zzz0///v1brs+aNSsXXHBBfvKTn2T77bfPJptskosuuij33ntv7r///iV6D0VJB3Dv3XflS18+In++46/54/U3Z8GCBdljt1GZM2dOuaNB1dhwlf454FPD8vSrMxf7+mE7rJHm9o0EQDubO3dO1l9/w/z4zJ+XOwpAp1JfX9/qaGxsfM97jzzyyHzmM5/Jjjvu2Or6I488kgULFrS6vvbaa2fo0KG57777liiPouTfLG6NUyX4/XU3Zv8DD846666X9TfYMGf/8sK8OmVyJj72SLmjQVVYprZLfvHFTXP8bx7NrLkL3vX6eivX5cs7Ds+YX/t3DqCa7TxyVE4ad2o++7n/KXcUgA+lVCp1uCNJhgwZkrq6upZj/Pjxi/3+rrjiijz66KOLfX369Onp3r17+vXr1+r6oEGDMn369CX6fey6RHdXsfda41SJ6utnJUn69x9Q5iRQHb6/70a57anpufvv/8zXP712q9d6du
uSsw79RE64YmL+Wf/ezTYAAPDhTJkyJX379m05r62tXew9X//613PLLbekR48eSzWPiZK89xqnStTU1JSxx4/J5iO2zLrrfazccaDD+9ymK2f9of0y/uqnF/v6yXttkIdfeDMTHp/WzskAAKBz6Nu3b6tjcUXJI488ktdeey0f//jH07Vr13Tt2jV33nlnfvazn6Vr164ZNGhQ5s+fn5kzZ7b6vBkzZmTw4MFLlEdRkvde47Q4jY2N71o/1Z6OO+bo/O2Zp3PBJZe16/tCNVqxf8+csvcGOerCh9L4TtO7Xt95gxXyybWXz0lXPV6GdAAAsGTK/ajfpfl44B122CFPPvlkJk6c2HJsuummGT16dMuvu3Xrlttuu63lcyZNmpTJkydnxIgRS/T72OmX3vxrjdNDDz30ge4fP358xo0bt5RTLd7xx34tE266ITfecntWWnnlsmSAarLB0H5Zvm+PTPj29i3XunapyRZrLJdDtl0tv77rpay6XK/8/Se7tfq887+8RR54/vV8/id3t3dkAADolPr06ZOPfaz1qopevXpl2WWXbbl+6KGHZsyYMRkwYED69u2bo48+OiNGjMgWW2yxRO/VqYuSD7PGaezYsRkzZkzLeX19fYYMGbK0IiZJmpub880xX88N112T6yfcllVWHbZU3w86i7v//s9sd8qtra6dcdAmeX767Jz152fzZkNjLr37pVav337Sjjn5qify5ycsxQEAgEpyxhlnpKamJnvuuWcaGxszcuTInH322Uv8dTp1UfLva5z+ZeHChbnrrrvyi1/8Io2NjenSpUurz6mtrV3seqml6bhjjs7vr7w8l135x/Tu3Sczih17+9bVpWfPnu2aBarJnMZ3Mmlq6+Vzc+e/k7fmzG+5vrgNXP/x5txMeWNuu2QEoP00NDTkxReebzl/5eWX88TjE9O//4AMGTq0jMkAWJw77rij1XmPHj1y1lln5ayzzvpIX7dTFyX/WuP07w455JCsvfba+X//7/+9qyQplwvPPzdJsuvIHVpdP+u8C7L/gQeXIxIAQNV57JGH8+l/+/PW2G9+I0my/wEH5bxfXVSuWAAf2L8/crcjqNSsnboo+SBrnCrBW3PfKXcE6DT+274jK37lj+2UBID29qltts3seQvLHQOAMvPUGwAAAIBCp54oWZz/XOMEAAAAHYGlN23DRAkAAABAQVECAAAAUFCUAAAAABTsUQIAAABVoFRadHQUlZrVRAkAAABAQVECAAAAUFCUAAAAABTsUQIAAABVoJRSSpW68cdilFKZWU2UAAAAABQUJQAAAAAFS28AAACgCng8cNswUQIAAABQUJQAAAAAFBQlAAAAAAV7lAAAAEAVKJU62OOBKzSriRIAAACAgqIEAAAAoGDpDQAAAFQBjwduGyZKAAAAAAqKEgAAAICCogQAAACgYI+SNtKvZ5f07dml3DGg6k09d49yR1hqDr7siXJHWGou2X+DckdYKt6Y8065Iyw1y/byRwQA6Gg8HrhtmCgBAAAAKChKAAAAAArmagEAAKAKeDxw2zBRAgAAAFBQlAAAAAAUFCUAAAAABXuUAAAAQBXweOC2YaIEAAAAoKAoAQAAACgoSgAAAAAK9igBAACAalBKKnTbj8Wr0KwmSgAAAAAKihIAAACAgqU3AAAAUAU8HrhtmCgBAAAAKChKAAAAAAqKEgAAAICCPUoAAACgCpQ62OOBKzWriRIAAACAgqIEAAAAoGDpDUCV2n6NAdl++LJZrnf3JMk/Zs3LtU++liemzU6SdKspZd+Pr5AtVumXrjWlPDmtIb9++B+pn/dOOWPzHy48/9xc/KvzMnnyK0mStddZN8d96zvZceddypwMqs89d9+Vn57xo0x87NFMnzYtl135h+z22d3LHQvgA/N44LZhogSgSr359oJc+fj0fPfm5/Ldm5/LM9Mb8vWtV8lKdbVJkv03WTEbr9Q3v7jnlYy/9cX0X6ZrvvapVcqcmv+04k
or58RTvp/b7n4gt951fz619XY5cJ898vdnni53NKg6c+fOyfrrb5gfn/nzckcBoIw6fVFy8sknt7Ru/zrWXnvtcscC+Mgm/mN2npg6OzNmz8+M2fPzhydmZN47TVl92WXSs1tNtl6tfy57dFr+NmNOXn7r7fzq/lczfPleWX3ZZcodnX+zy6d3zU4jR2X1NYZnjeFr5oSTT02v3r3z8EMPlDsaVJ2dR47KSeNOzWc/9z/ljgJAGVl6k2S99dbLrbfe2nLetavfFqC6lErJZkPrUtu1Js+/PjerDuiZrl1q8sz02S33TKtvzOtz5meN5ZbJC2/MLWNa3svChQtz7R9/n7lz5uQTm21R7jgAAFVJI5BFxcjgwYPLHQOgza1c1yMn7rx6unWpybx3mvKzu1/J1PrGDO3fMwsWNmXugqZW99fPeyd1Pf2nodI889STGbXDpzJv3rz06t07l1z++6y1zrrljgUAVBiPB24bnX7pTZI899xzWXHFFbPaaqtl9OjRmTx58nve29jYmPr6+lYHQKWaNrsxJ970XE6Z8Hxuf+6NHLbFkKzYt7bcsVhCa6y5Vm6/9+FMuOOvOeRLX85Rh38xk/72TLljAQBUpU5flGy++ea5+OKLc/PNN+ecc87JSy+9lE996lOZPXv2Yu8fP3586urqWo4hQ4a0c2KAD25hU3Nea5ifl996O1c9Pj1TZr6dnddaLrPmLUi3LjVZplvr/wz07dE1s9721JtK071796y2+hrZaONNcuK407Le+hvkvLNtNgkAsDR0+qJk1KhR2WuvvbLBBhtk5MiRufHGGzNz5sxceeWVi71/7NixmTVrVssxZcqUdk4M8OGVUkrXLqW8/ObbeWdhU9Yd3LvltcF9arNcr+55/nX7k1S6pqamzJ/fWO4YAECF+c8HlXSEoxJZiP4f+vXrlzXXXDPPP//8Yl+vra1Nba2xdaDy7bXh4DwxdXbemDs/Pbp2yYhV+2XtQb3yo9tfy9sLmnLXi29lv4+vmIbGhZm3oCkHbLpinvvnHBu5VphTv3tCdthpl6w8ZEgaZs/OH666In+9+85cde2N5Y4GVaehoSEvvvB/fwZ85eWX88TjE9O//4AMGTq0jMkAaE+Kkv/Q0NCQF154IQceeGC5owB8JH16dM1hI4akX8+ueXtBU6bMfDs/uv2lPD29IUly2SNT09S8Qo7+1Crp1qUmT06bnV8/9I8yp+Y/vf7P13Lk4YdkxvRp6du3Lut+bP1cde2N2Xb7HcsdDarOY488nE+P3KHlfOw3v5Ek2f+Ag3Lery4qVywA2lmnL0qOO+647LbbbllllVUyderUfPe7302XLl2y3377lTsawEdy4QOvvu/rC5qac+nDU3Ppw1PbKREfxk/PPr/cEaDT+NQ222b2vIXljgFAmXX6ouTVV1/NfvvtlzfeeCPLL798ttpqq9x///1Zfvnlyx0NAAAAPrBK3vdjcSo1a6cvSq644opyRwAAAAAqRKd/6g0AAADAvyhKAAAAAAqdfukNAAAAVINSadHRUVRqVhMlAAAAAAVFCQAAAEDB0hsAAACoAh4P3DZMlAAAAAAUFCUAAAAABUUJAAAAQMEeJQAAAFAFPB64bZgoAQAAACgoSgAAAAAKlt4AAABAFfB44LZhogQAAACgoCgBAAAAKChKAAAAAAr2KAEAAIAqUErlPnJ3cSo1qokSAAAAgIKiBAAAAKBg6Q0AAABUgZpSKTUdaO1NpWY1UQIAAABQMFHSRhoam1LT2FTuGG2ud60uDdrLJftvUO4IS03vvS8ud4SlouHKL5Q7AnQa/kwCQHvxXxwAAACAgokSAAAAqAKlUgd7PHCFZjVRAgAAAFBQlAAAAAAUFCUAAAAABXuUAAAAQBUolUopVerGH4tRqVlNlAAAAAAUFCUAAAAABUtvAAAAoArUlBYdHUWlZjVRAgAAAFBQlAAAAAAUFCUAAAAABXuUAAAAQDUoVe4jdxerQqOaKAEAAA
AoKEoAAAAACpbeAAAAQBUolRYdHUWlZjVRAgAAAFBQlAAAAAAUFCUAAAAABXuUAAAAQBUoFR8dRaVmNVHSQdxz913Za4/PZviwldOnR5dcf9015Y4EUBHGfG79NFz5hfzg4M1arg2s65nzj/pUXvjlPpnx69G55/Td8rnNVyljSgAAOgpFSQcxd+6crL/+hvnxmT8vdxSAivHx1ZfNF3daM0++/Gar6+cftVWGr9g3e//gtmx+3LW57sFX8utjt8kGqw4oU1IAADoKRUmSf/zjHznggAOy7LLLpmfPnll//fXz8MMPlztWKzuPHJWTxp2az37uf8odBaAi9KrtmguO3jpHnXdvZs6Z3+q1zdcamHNv+lseeeH1vPxaQ374xycyc878bLzasmVKCwCw9NWUOt5RiTp9UfLWW2/lk5/8ZLp165abbropzzzzTH784x+nf//+5Y4GwPv4yZe2yITHXs0dT05712sPTHote245LP17dU+plHx+y2Hp0a1L7n56ehmSAgDQkXT6zVx/8IMfZMiQIbnoootarg0bNqyMiQD4bz6/5bBsNGzZbD32T4t9/aAz7swlx2yTKRftnwXvNGXu/Hey349uz4szZrdzUgAAOppOP1Fy3XXXZdNNN81ee+2VgQMHZuONN87555//nvc3Njamvr6+1QFA+1lp2WXywy9sli/+7K40Lli42HtO3Gfj1PXqnl1PmZBPjb0+v/jT0/n1sdtmvSH92jUrAAAdT6efKHnxxRdzzjnnZMyYMfn2t7+dhx56KF/72tfSvXv3HHzwwe+6f/z48Rk3blwZkgKQJBuvtlwG9uuZv/5gt5ZrXbvU5JPrDMqXd1k7Gx9zdb4yap18Ysw1+durM5MkT73yVrZce1AO32WdfP38+8qUHABg6SqVSimVKnTjj8Wo1KydvihpamrKpptumu9///tJko033jhPPfVUzj333MUWJWPHjs2YMWNazuvr6zNkyJB2ywvQ2d3x5NRs9o1rWl0754it8uzUWTnj2iezTPcuSZKm5uZW9yxsaq7YDcMAAKgcnb4oWWGFFbLuuuu2urbOOuvkD3/4w2Lvr62tTW1tbXtEa6WhoSEvvvB8y/krL7+cJx6fmP79B2TI0KHtngegXBrmvZNnpsxsdW1u4zt5c3ZjnpkyM127lPL8tPr87LAR+falD+fNhsbs+omh2X6DFfP5H9xantAAAHQYnb4o+eQnP5lJkya1uvbss89mlVVWKVOixXvskYfz6ZE7tJyP/eY3kiT7H3BQzvvVRe/1aQCdzjsLm7Pn+FtyyuhNctX/2yG9enTNi9Nn5/Cz7s6fH/tHueMBAFDhOn1Rcuyxx2bLLbfM97///ey999558MEH88tf/jK//OUvyx2tlU9ts21mz1v8poUAnd2ocTe3On9h+uyM/vEd5QkDAFAmpdKio6Oo1Kyd/qk3n/jEJ3L11Vfn8ssvz8c+9rGceuqpOfPMMzN69OhyRwMAAADaWaefKEmSXXfdNbvuumu5YwAAAABlpigBAACAKlBTKqWmUtezLEalZu30S28AAAAA/kVRAgAAAFBQlAAAAAAU7FECAAAAVcDjgduGiRIAAACAgqIEAAAAoGDpDQAAAFSBUqmUUqWuZ1mMSs1qogQAAACgoCgBAAAAKChKAAAAAAr2KAEAAIAq4PHAbcNECQAAAEBBUQIAAABQsPQGAAAAqkBNqZSaSl3PshiVmtVECQAAAEBBUQIAAABQUJQAAAAAFOxRAgAAAFWgVBwdRaVmNVECAAAAUDBR0kZ619akd63eqSNpaGwqd4Slxv8WqTQNV36h3BGWip4bH1XuCEvN24/9otwRAADKwt+mAAAAAAomSgAAAKAKlEqllEqVuvPHu1VqVhMlAAAAAAVFCQAAAEDB0hsAAACoAjWlRUdHUalZTZQAAAAAFBQlAAAAAAVFCQAAAEDBHiUAAABQBTweuG2YKAEAAAAoKEoAAAAACpbeAAAAQJWo0N
UsHYqJEgAAAICCogQAAACgoCgBAAAAKNijBAAAAKqAxwO3DRMlAAAAAAVFCQAAAFDxzjnnnGywwQbp27dv+vbtmxEjRuSmm25qeX3evHk58sgjs+yyy6Z3797Zc889M2PGjCV+H0UJAAAAVIGaUsc7lsTKK6+c008/PY888kgefvjhbL/99vnc5z6Xp59+Okly7LHH5vrrr89VV12VO++8M1OnTs0ee+yxxL+P9igBAAAAKt5uu+3W6vy0007LOeeck/vvvz8rr7xyLrjgglx22WXZfvvtkyQXXXRR1llnndx///3ZYostPvD7mCgBAAAAOpSFCxfmiiuuyJw5czJixIg88sgjWbBgQXbccceWe9Zee+0MHTo099133xJ9bRMlsBTdc/dd+ekZP8rExx7N9GnTctmVf8hun9293LGACnbClz+d73zl062uTXppejba43st55tvMCwnH7lrPrH+qlm4sClPPPuP7PbVszKvcUF7xwUA+Mjq6+tbndfW1qa2tnax9z755JMZMWJE5s2bl969e+fqq6/Ouuuum4kTJ6Z79+7p169fq/sHDRqU6dOnL1EeRQksRXPnzsn662+YAw8+JKP3+Xy54wAdxNPPT81nvvLzlvN3Fja1/HrzDYbl2l98NT+66M8Z84Or8s7Cpmyw5kppamouR1QAoIJ01McDDxkypNX17373uzn55JMX+zlrrbVWJk6cmFmzZuX3v/99Dj744Nx5551tmqvTFyWrrrpqXnnllXdd/+pXv5qzzjqrDImoJjuPHJWdR44qdwygg3lnYVNmvDF7sa/98Bt75Owr7siPLrql5dpzr7zWXtEAANrclClT0rdv35bz95omSZLu3btnjTXWSJJssskmeeihh/LTn/40++yzT+bPn5+ZM2e2miqZMWNGBg8evER5Ov0eJQ899FCmTZvWctxyy6I/eO61115lTgZAZ7XG0OXz4p9PyzPXn5yLTjs4Qwb3T5Is3793NttgWP75ZkNuv3hMXr71+/nzr76eLTdarcyJAQA+vH897vdfx/sVJf+pqakpjY2N2WSTTdKtW7fcdtttLa9NmjQpkydPzogRI5YoT6efKFl++eVbnZ9++ulZffXVs80225QpEQCd2UNPvZzDT/pNnn1lRgYvV5cTvjwqt154bDb5/GkZtvJySRbtYzL2jKvzxKRXM3rXzXLjeUdnk72+nxcm/7PM6QEAlp6xY8dm1KhRGTp0aGbPnp3LLrssd9xxRyZMmJC6uroceuihGTNmTAYMGJC+ffvm6KOPzogRI5boiTeJoqSV+fPn5ze/+U3GjBnznuu6Ghsb09jY2HL+n5vOAMBH8ee/PtPy66eem5qHnnw5k248JXvu/PFMemnRRmQX/OGeXHrd/UmSxye9mm03WysHf25ETvr5dWXJDABUhlJxdBRLmvW1117LQQcdlGnTpqWuri4bbLBBJkyYkJ122ilJcsYZZ6SmpiZ77rlnGhsbM3LkyJx99tlLnEtR8m+uueaazJw5M1/4whfe857x48dn3Lhx7RcKgE5tVsPbeX7ya1l9yPK548FnkyR/e7H1zu2TXpresjwHAKBaXXDBBe/7eo8ePXLWWWd95P1GO/0eJf/uggsuyKhRo7Liiiu+5z1jx47NrFmzWo4pU6a0Y0IAOptePbtn2MrLZfrrs/LK1Dcy9bWZWXPVga3uWWOVgZk87c0yJQQAqC4mSgqvvPJKbr311vzxj3983/ve73nO8J8aGhry4gvPt5y/8vLLeeLxienff0CGDB1axmRApRp/7P/khruezOSpb2bFgXX5zlc+k4VNTbny5keSJGdccmu+85XP5Mln/5HHJ72aA3bbPGutOij7H//+P2EBAKpfTamUmg70eOBKzaooKVx00UUZOHBgPvOZz5Q7ClXksUcezqdH7tByPvab30iS7H/AQTnvVxeVKxZQwVYa1C+/Hn9IBtQtk9ffasi9E1/MNgf9OK+/1ZAk+cVld6RHbb
f88Bt7pn/dMnny2X9k1yN+kZdefb3MyQEAqoOiJIseJ3TRRRfl4IMPTteufktoO5/aZtvMnrew3DGADuSgb/33EvVHF92SH110SzukAQDofOxRkuTWW2/N5MmT88UvfrHcUQAAAIAyMj6RZOedd05zc3O5YwAAAMCHViotOjqKSs1qogQAAACgoCgBAAAAKFh6AwAAAFWgVCqlVKnrWRajUrOaKAEAAAAoKEoAAAAACooSAAAAgII9SgAAAKAKeDxw2/hQEyV33313DjjggIwYMSL/+Mc/kiSXXnpp7rnnnjYNBwAAANCelrgo+cMf/pCRI0emZ8+eeeyxx9LY2JgkmTVrVr7//e+3eUAAAACA9rLERcn3vve9nHvuuTn//PPTrVu3luuf/OQn8+ijj7ZpOAAAAID2tMR7lEyaNClbb731u67X1dVl5syZbZEJAAAAWEI1pVJqKnXjj8Wo1KxLPFEyePDgPP/88++6fs8992S11VZrk1AAAAAA5bDERclhhx2Wr3/963nggQdSKpUyderU/Pa3v81xxx2XI444YmlkBAAAAGgXS7z05lvf+laampqyww47ZO7cudl6661TW1ub4447LkcfffTSyAgAAAD8Fx4P3DaWuCgplUo54YQTcvzxx+f5559PQ0ND1l133fTu3Xtp5AMAAABoN0tclPxL9+7ds+6667ZlFgAAAICyWuKiZLvttkvpfeZj/vKXv3ykQAAAAADlssRFyUYbbdTqfMGCBZk4cWKeeuqpHHzwwW2VCwAAAFgCpVLpfQcbKk2lZl3iouSMM85Y7PWTTz45DQ0NHzkQAAAAQLks8eOB38sBBxyQCy+8sK2+HAAAAEC7+9Cbuf6n++67Lz169GirLwdLXe/aNusJoU00NDaVO8JSU63/vr392C/KHWGp+fu0OeWOsNSsvUKvckcAgKWiJm04DdEOKjXrEhcle+yxR6vz5ubmTJs2LQ8//HBOPPHENgsGAAAA0N6WuCipq6trdV5TU5O11lorp5xySnbeeec2CwYAAADQ3paoKFm4cGEOOeSQrL/++unfv//SygQAAABQFku0JKhLly7ZeeedM3PmzKUUBwAAAPgw/vV44I50VKIl3jvlYx/7WF588cWlkQUAAACgrJa4KPne976X4447Ln/6058ybdq01NfXtzoAAAAAOqoPvEfJKaeckm984xv59Kc/nST57Gc/22pMprm5OaVSKQsXLmz7lAAAAMD7KpWSmspczbJYFbry5oMXJePGjctXvvKV3H777UszDwAAAEDZfOCipLm5OUmyzTbbLLUwAAAAAOW0RHuUVOqOtAAAAABt4QNPlCTJmmuu+V/LkjfffPMjBQIAAACWXE0H26OkUrMuUVEybty41NXVLa0sAAAAAGW1REXJvvvum4EDBy6tLAAAAABl9YH3KLE/CQAAAFDtlvipNwAAAEDlKZVKHWrIoVKzfuCipKmpaWnmAAAAACi7JXo8MAAAAEA1W6LNXAEAAIDK5PHAbcNECQAAAEBBUQIAAABQUJQAAAAAFBQlAJ3IPXfflb32+GyGD1s5fXp0yfXXXVPuSHRycxpm5/snfTPbf2KdbLTactlvtx3y5MRHyh0LADqkUqnjHZVIUQLQicydOyfrr79hfnzmz8sdBZIk3/nGkbn3rr/kBz8/P9fe9kA+uc32+eI+u2XGtKnljgYAdFKdvihZuHBhTjzxxAwbNiw9e/bM6quvnlNPPTXNzc3ljgbQ5nYeOSonjTs1n/3c/5Q7CmTe22/nlhuvzXHf+V4+scVWWWXY6jnquBMydNXVcvmvzy93PACgk+r0jwf+wQ9+kHPOOSeXXHJJ1ltvvTz88MM55JBDUldXl6997WvljgcAVWvhwneycOHC1NbWtrreo0fPPPrgfWVKBQAdV02plJpKXc+yGJWatdMXJffee28+97nP5TOf+UySZNVVV83ll1+eBx98sMzJAKC69erdJxttsnnOOfMHWX342l
l2+YG54ZqrMvGRBzJ01dXLHQ8A6KQ6/dKbLbfcMrfddlueffbZJMnjjz+ee+65J6NGjVrs/Y2Njamvr291AAAfzg9+fn6am5uzzceHZ8NVB+Q3F5yTz+y+V2pqKvMnTABA9ev0EyXf+ta3Ul9fn7XXXjtdunTJwoULc9ppp2X06NGLvX/8+PEZN25cO6cEgOo0dNXVcukfJ2Tu3DlpmD07AwcNzrFfPigrrzKs3NEAgE6q00+UXHnllfntb3+byy67LI8++mguueSS/OhHP8oll1yy2PvHjh2bWbNmtRxTpkxp58QAUH2WWaZXBg4anFkz38pf77wtO4z8TLkjAUCHU9MBj0rU6SdKjj/++HzrW9/KvvvumyRZf/3188orr2T8+PE5+OCD33V/bW3tuzadA+goGhoa8uILz7ecv/Lyy3ni8Ynp339AhgwdWsZkdFb33HFrmpubM2z14XnlpRfzo1NPyLA11sz/7HNguaMBAJ1Upy9K5s6dm5qa1j1Wly5d0tTUVKZEAEvPY488nE+P3KHlfOw3v5Ek2f+Ag3Lery4qVyw6sdn1s3LG+JMzfdo/Utevf3b+9OdyzLe+m27dupU7GgDQSXX6omS33XbLaaedlqFDh2a99dbLY489lp/85Cf54he/WO5oAG3uU9tsm9nzFpY7BrQY9dk9M+qze5Y7BgBUhVJp0dFRVGrWTl+U/PznP8+JJ56Yr371q3nttdey4oor5stf/nJOOumkckcDAAAA2lmnL0r69OmTM888M2eeeWa5owAAAABlVqmbzAIAAAC0u04/UQIAAADVoCal1FTqxh+LUZPKzGqiBAAAAKCgKAEAAAAoKEoAAAAACvYoAQAAgCpQKi06OopKzWqiBAAAAKCgKAEAAAAoWHoDAAAAVaCmtOjoKCo1q4kSAAAAgIKiBAAAAKCgKAEAAAAo2KMEAAAAqkCplNRU6jN3F6NSo5ooAQAAACgoSgAAAAAKlt4AAABAFSiVKnc5y+JUalYTJQAAAAAFRQkAAABAQVECAAAAULBHCQAAAFSBmtKio6Oo1KyKEoAK0bvWkB+VY+0VepU7AgBAWfhTOQAAAEDBRAkAAABUgVLx0VFUalYTJQAAAAAFRQkAAABAQVECAAAAULBHCQAAAFQBjwduGyZKAAAAAAqKEgAAAICCogQAAACgYI8SAAAAqAL2KGkbJkoAAAAACooSAAAAgIKlNwAAAFAFSqVSSqUKXc+yGJWa1UQJAAAAQEFRAgAAAFBQlAAAAAAU7FECAAAAVcDjgduGiRIAAACAgqIEAAAAoGDpDQAAAFSBUmnR0VFUalYTJQAAAAAFRQkAAABAQVGSZPbs2TnmmGOyyiqrpGfPntlyyy3z0EMPlTsWAAAA0M7sUZLkS1/6Up566qlceumlWXHFFfOb3/wmO+64Y5555pmstNJK5Y4HAAAA/1VNqZSaSt34YzEqNWunnyh5++2384c//CE//OEPs/XWW2eNNdbIySefnDXWWCPnnHNOueMBAAAA7ajTFyXvvPNOFi5cmB49erS63rNnz9xzzz1lSgUAAACUQ6dfetOnT5+MGDEip556atZZZ50MGjQol19+ee67776sscYa77q/sbExjY2NLef19fXtGRcAAAAWq6a06OgoKjVrp58oSZJLL700zc3NWWmllVJbW5uf/exn2W+//VJT8+7fnvHjx6eurq7lGDJkSBkSAwAAAEuDoiTJ6quvnjvvvDMNDQ2ZMmVKHnzwwSxYsCCrrbbau+4dO3ZsZs2a1XJMmTKlDIkBAACApaHTL735d7169UqvXr3y1ltvZcKECfnhD3/4rntqa2tTW1tbhnQAAADA0qYoSTJhwoQ0NzdnrbXWyvPPP5/jjz8+a6+9dg455JByRwMAAIAPppRU6BN3F69Cs1p6k2TWrFk58sgjs/baa+eggw7KVlttlQkTJqRbt27ljgYAAAC0IxMlSfbee+/svffe5Y4BAAAAlJmJEgAAAI
CCiRIAAACoAjUppaZSN/5YjErNaqIEAAAAoKAoAQAAAChYegMAAABVoNTBHg9cqVlNlAAAAAAUFCUAAAAABUUJAAAAQMEeJQAAAFAFakqLjo6iUrOaKAEAAAAoKEoAAAAACpbeAAAAQBWoKZVSU6nP3F2MSs1qogQAAACgoCgBAAAAKChKAAAAAAr2KAEAAIAqUCotOjqKSs1qogQAAACgoCgBAAAAKFh6AwAAAFWgJh3s8cCpzKwmSgAAAAAKihIAAACAgqU3AABVYtyfnyt3hKXmK1usWu4IS82gvt3KHQGAf6MoAQAAgCrg8cBtw9IbAAAAgIKiBAAAAKCgKAEAAAAo2KMEAAAAqkBNOtY0RKVmrdRcAAAAAO1OUQIAAABQsPQGAAAAqkCpVEqpUp+5uxiVmtVECQAAAEBBUQIAAABQUJQAAAAAFOxRAgAAAFWgVBwdRaVmNVECAAAAUFCUAAAAABQsvQEAAIAqUFMqpaZCH7m7OJWa1UQJAAAAQEFRAgAAAFS88ePH5xOf+ET69OmTgQMHZvfdd8+kSZNa3TNv3rwceeSRWXbZZdO7d+/sueeemTFjxhK9j6IEAAAAqHh33nlnjjzyyNx///255ZZbsmDBguy8886ZM2dOyz3HHntsrr/++lx11VW58847M3Xq1Oyxxx5L9D72KAEAAIAqUZm7frSNm2++udX5xRdfnIEDB+aRRx7J1ltvnVmzZuWCCy7IZZddlu233z5JctFFF2WdddbJ/fffny222OIDvY+JEgAAAKBs6uvrWx2NjY0f6PNmzZqVJBkwYECS5JFHHsmCBQuy4447ttyz9tprZ+jQobnvvvs+cB5FCQAAAFA2Q4YMSV1dXcsxfvz4//o5TU1NOeaYY/LJT34yH/vYx5Ik06dPT/fu3dOvX79W9w4aNCjTp0//wHk6xdKbbbfdNhtttFHOPPPMrLrqqjnmmGNyzDHHlDsWAEBFu/uKc/O3v/45r095KV2712bIuhtnp0OPz3JDVmu556LjD8grTzzY6vM2+fS+2e3rp7R33Db185/8MKeN+04OO+LonHr6j8sdB+ADKZUWHR3Fv7JOmTIlffv2bbleW1v7Xz/3yCOPzFNPPZV77rmnzXN1iqLk3z300EPp1atXuWMAAFS8l594KJ/Y7YCstOb6aVr4Tm67+Ce59NtfzJHn35juPZZpue/jo/bOdgd9veW8W23PcsRtM4898nB+fdGvsu7H1i93FIBOoW/fvq2Kkv/mqKOOyp/+9KfcddddWXnllVuuDx48OPPnz8/MmTNbTZXMmDEjgwcP/sBfv9MtvVl++eWzzDLL/PcbAQA6uQO/f0E23nmPDFx1eAavvk52/8YPMuu1qZn63NOt7utW2zN9BizfcvTo1btMiT+6OQ0NOfKwg/Ljn52Tun79yx0HgH/T3Nyco446KldffXX+8pe/ZNiwYa1e32STTdKtW7fcdtttLdcmTZqUyZMnZ8SIER/4faquKJkzZ04OOuig9O7dOyussEJ+/OPWo5KrrrpqzjzzzCTJ/vvvn3322afV6wsWLMhyyy2XX//61+0VGQCgQ5g3Z3aSpGefulbXn7z9uvxgr81y1uGfya0X/ijz571djnht4lvHfS07jvx0tt5uh3JHAeA/HHnkkfnNb36Tyy67LH369Mn06dMzffr0vP32ov/u1NXV5dBDD82YMWNy++2355FHHskhhxySESNGfOAn3iRVuPTm+OOPz5133plrr702AwcOzLe//e08+uij2Wijjd517+jRo7PXXnuloaEhvXsv+snHhAkTMnfu3PzP//xPOycHAKhcTU1Nufnc0zJkvY9n0Kprtlxff7td02/gSumz7MDMeGlSbrngf/P6qy9l35POKmPaD+ea3/8uTz7+WG6+/YM/GQGgkpRKpZQ60CYlS5r1nHPOSbJoH9J/d9FFF+ULX/hCkuSMM85ITU1N9txzzzQ2NmbkyJE5++yzl+h9qqooaWhoyA
UXXJDf/OY32WGHRT8FuOSSS1qtWfp3I0eOTK9evXL11VfnwAMPTJJcdtll+exnP5s+ffos9nMaGxtbPaqovr6+jb8LAIDKc+MvxuW1V57LF398eavrm35635ZfDxq2VnoPWD6//n8H582pkzNgxaHtHfND+8erU/Kdb30jV15zY3r06FHuOAAsRnNz83+9p0ePHjnrrLNy1lkfvrCvqqU3L7zwQubPn5/NN9+85dqAAQOy1lprLfb+rl27Zu+9985vf/vbJIuW7Vx77bUZPXr0e77H+PHjWz22aMiQIW37TQAAVJgbfjEuzz5we77ww1+nbvn33wxv5bU3TJK8OfWV9ojWZp6Y+Ghe/+dr2WnrzbPSgJ5ZaUDP3HfPXfnVub/ISgN6ZuHCheWOCEA7qaqJkg9j9OjR2WabbfLaa6/llltuSc+ePbPLLru85/1jx47NmDFjWs7r6+uVJQBAVWpubs6NZ52Sv997S77wv79J/8H//c8801/4W5Kk94Dll3a8NvWpbbbP7fc92uraMV89LMPXXCtHHnNcunTpUqZkALS3qipKVl999XTr1i0PPPBAhg5dNOr51ltv5dlnn80222yz2M/ZcsstM2TIkPzud7/LTTfdlL322ivdunV7z/eora39QM90BgDo6G74xbg8efv12e/kc9K9Z6/MfvOfSZIevfqkW22PvDl1cp68/foM32yb9OzTLzNempQJ530/q6z/iQxebe0yp18yvfv0yTrrfqzVtWV69Ur/Acu+6zpApapJx1o2UqlZq6oo6d27dw499NAcf/zxWXbZZTNw4MCccMIJqal5/9/+/fffP+eee26effbZ3H777e2UFgCgsj38p8uSJBcff0Cr65/7xunZeOc90qVrt7z42L25/+pLMn/e3NQtv0LW2Wpktt7vq+WICwBtoqqKkiT53//93zQ0NGS33XZLnz598o1vfCOzZs16388ZPXp0TjvttKyyyir55Cc/2U5JAQAq28kTnn3f1+sGrpBDfvTbdkrT/q6+4dZyRwCgDKquKOndu3cuvfTSXHrppS3Xjj/++JZfv/zyy+/6nHXWWecD7Z4LAAAAlaraHw/cXip1SRAAAABAu1OUAAAAABQUJQAAAACFqtujBAAAADqjUnF0FJWa1UQJAAAAQEFRAgAAAFCw9AYAAACqgMcDtw0TJQAAAAAFRQkAAABAQVECAAAAULBHCQAAAFSBmnSsaYhKzVqpuQAAAADanaIEAAAAoKAoAQAAACjYowQAAACqQKlUSqlUKneMD6xSs5ooAQAAACgoSgAAAAAKlt4AAABAFSgVR0dRqVlNlAAAAAAUFCUAAAAABUUJAAAAQMEeJQAAVeK7Ow8vdwQAyqhUWnR0FJWa1UQJAAAAQEFRAgAAAFCw9AYAAACqQE1KqanYh+6+W6VmNVECAAAAUFCUAAAAABQUJQAAAAAFe5QAAABAFfB44LZhogQAAACgoCgBAAAAKFh6AwAAAFWgVHx0FJWa1UQJAAAAQEFRAgAAAFBQlAAAAAAU7FECAAAAVcDjgduGiRIAAACAgqIEAAAAoKAoAQAAACjYowQAAACqQCml1KRCN/5YjFKFZjVRAgAAAFBQlAAAAAAULL0BAACAKuDxwG3DRAkAAABAQVECAAAAUFCUAAAAABQ6fFHS1NSU8ePHZ9iwYenZs2c23HDD/P73v295/cYbb8yaa66Znj17ZrvttsvFF1+cUqmUmTNnpr6+Pj179sxNN93U6mteffXV6dOnT+bOndve3w4AAAB8KP/ao6QjHZWowxcl48ePz69//euce+65efrpp3PsscfmgAMOyJ133pkpU6Zkjz32yG677ZaJEyfmS1/6Ur71rW+1fG7fvn2z66675rLLLmv1NX/7299m9913zzLLLNPe3w4AAABQRh36qTeNjY35/ve/n1tvvTUjRoxIkqy22mq55557ct5552XVVVfN6quvnh//+MdJkrXWWitPPvlkfvCDH7
R8jdGjR+fAAw/M3Llzs8wyy6S+vj433HBDrr766vd8z8bGxpbz+vr6pfgdAgAAAO2pQxclzz//fObOnZuddtqp1fX58+dn4403zttvv53NN9+81Wv/KlT+5dOf/nS6deuW6667Lvvuu2/+8Ic/pG/fvtlxxx0X+57jx4/PuHHj2vYbAQAAgI+oVHx0FJWatUMvvWloaEiS3HDDDZk4cWLL8cwzz7Tap+T9dO/ePZ///Odblt9cdtll2WeffdK16+I7pLFjx2bWrFktx5QpU9rmmwEAAADKrkNPlKy77rqpra3N5MmTs80227zr9XXWWSfXXXddq2v333//u+4bPXp0dtpppzz99NP5y1/+ku9973vv+Z61tbWpra396OEBAACAitOhi5I+ffrkuOOOy7HHHpumpqZstdVWmTVrVv7617+mb9+++cpXvpIf//jHOf744/OlL30pjzzySC6++OJ3fZ2tt946gwcPzujRozNs2LB3LdcBAAAAOocOvfQmSU499dSceOKJGT9+fNZZZ53ssssuueGGGzJs2LAMHTo0f/jDH3LNNddkww03zLnnnpvvf//77/oapVIp++23Xx5//PGMHj26DN8FAAAAfDQ1pY53VKJSc3Nzc7lDtKc77rgj2223Xd56663069fvI3+9+vr61NXVZdasWenbt+9HDwgAAECbq+a/u/3re7v2oRfTq3efcsf5wOY0zM7nPrFaxf0z6fATJQAAAABtpUPvUQIAAAAs4vHAbaPTFSXbbrttOtlqIwAAAOADsvQGAAAAoKAoAQAAACh0uqU3AAAAUI1KpUVHR1GpWU2UAAAAABQUJQAAAAAFRQkAAABAwR4lAAAAUAVKSUqp0I0/FqNSk5ooAQAAACgoSgAAAAAKlt4AAABAFagpLTo6ikrNaqIEAAAAoKAoAQAAACgoSgAAAAAK9igBAACAKlAqPjqKSs1qogQAAACgoCgBAAAAKFh6AwAAAFWgVFp0dBSVmtVECQAAAEDBRAkAAJTRCTc9W+4IS81po9YsdwQ+hJlvLyx3hKWivkq/L9qeiRIAAACAgokSAAAAqAKl4ugoKjWriRIAAACAgqIEAAAAoGDpDQAAAFSBmpRSU6nP3F2MmgpdfGOiBAAAAKCgKAEAAAAoKEoAAAAACvYoAQAAgCrg8cBtw0QJAAAAQEFRAgAAAFBQlAAAAAAU7FECAAAA1cAmJW3CRAkAAABAQVECAAAAULD0BgAAAKpAqfjoKCo1q4kSAAAAgIKiBAAAAKCgKAEAAAAo2KMEAAAAqkEpKVXmth+LV6FZTZQAAAAAFDrURMkdd9yR7bbbLm+99Vb69etX7jgAAFBx7v3deZl075/zxqsvpmv3Hll5nY2z3RePy7Irr9bqvlf/9ljuvOSMTJ30REo1NRm02jrZ93sXpFttjzIlp7P4yf+enj9de02ee/bv6dGzZzbbfERO/t74DF9zrXJHgyQdrCjZcsstM23atNTV1ZU7CgAAVKTJTz2YTXYdnRXWXD9NCxfmjkt+kstPODSHn3dDuvdYJsmikuR3J34pI/b+cnY+4sTUdOmSGS/+PaUaA+csfffefVe+9OUjsvEmm+add97Jqd/9TvbYbVTuf/TJ9OrVq9zxOrRSKnY1y2JVatYOVZR07949gwcPLncMAACoWPueekGr813HnJ6f7jci0597OkPX/0SS5NZfjs+mnz0wW+59eMt9/zlxAkvL76+7sdX52b+8MMNXWSETH3skn9xq6zKlgv9T1sp42223zdFHH51jjjkm/fv3z6BBg3L++ednzpw5OeSQQ9KnT5+sscYauemmm5IsWnpTKpUyc+bMJMnFF1+cfv36ZcKECVlnnXXSu3fv7LLLLpk2bVqr9zjmmGNave/uu++eL3zhCy3nZ599doYPH54ePXpk0KBB+fznP7+0v3UAAGgXjXNmJ0l69Fk0lT1n5huZOunxLNNv2VzyjX1z5v5b5tJvHpApTz9czph0Yv
X1s5Ik/fsPKHMSWKTss3WXXHJJlltuuTz44IM5+uijc8QRR2SvvfbKlltumUcffTQ777xzDjzwwMydO3exnz937tz86Ec/yqWXXpq77rorkydPznHHHfeB3//hhx/O1772tZxyyimZNGlSbr755my99Xu3mI2Njamvr291AABAJWpuasqt530/K6/78Qxcdc0kyczpU5Ik9/z2F9lo5F7Z99RfZfAa6+aysV/Im/94uYxp6Yyampoy9vgx2XzElll3vY+VOw4kqYCiZMMNN8x3vvOdDB8+PGPHjk2PHj2y3HLL5bDDDsvw4cNz0kkn5Y033sgTTzyx2M9fsGBBzj333Gy66ab5+Mc/nqOOOiq33XbbB37/yZMnp1evXtl1112zyiqrZOONN87Xvva197x//PjxqaurazmGDBmyxN8zAAC0h5vPHpd/vvJcdv/WGS3XmpuakiQbj9onG+68Zwavvm52OvzbGbDysDz+5z+UKyqd1HHHHJ2/PfN0LrjksnJHqQ6lDnhUoLIXJRtssEHLr7t06ZJll10266+/fsu1QYMGJUlee+21xX7+Msssk9VXX73lfIUVVnjPexdnp512yiqrrJLVVlstBx54YH7729++5/RKkowdOzazZs1qOaZMmfKB3wsAANrLhLNPyfMP3pHRp1+Svsv93z5/vQcsnyRZbujqre5fbsjqqf/n1HbNSOd2/LFfy4Sbbsj1N9+alVZeudxxoEXZi5Ju3bq1Oi+VSq2ulUqLKqamovn+IJ/f3Nzccl5TU9PqPFk0hfIvffr0yaOPPprLL788K6ywQk466aRsuOGGLfug/Kfa2tr07du31QEAAJWiubk5E84+JZPuuyWjx1+SfoNbT0DXDVo5vZcdmDdefanV9Tf/8XLqBq7UnlHppJqbm3P8sV/LDdddk+tuuiWrrDqs3JGglbIXJUvb8ssv32pz14ULF+app55qdU/Xrl2z44475oc//GGeeOKJvPzyy/nLX/7S3lEBAOAjm3D2uDx1+3X53Dd/nO49e6XhzX+m4c1/ZkHjvCSLfrC4xZ6H5uHrLs3f7rk5b059JXf++sy88eqL2XCkhxqw9B13zNG58orf5vyLL03v3n0yY/r0zJg+PW+//Xa5o3V4pQ74UYk61OOBP4ztt98+Y8aMyQ033JDVV189P/nJT1pNi/zpT3/Kiy++mK233jr9+/fPjTfemKampqy11lrlCw0AAB/SozdcniT57f87sNX1XY8dnw122iNJstnuX8g78+fn1l+Oz7zZszJwtbWz32kXpv8KQ9s9L53PheefmyTZdeQOra6fdd4F2f/Ag8sRCVqp+qLki1/8Yh5//PEcdNBB6dq1a4499thst912La/369cvf/zjH3PyySdn3rx5GT58eC6//PKst956ZUwNAAAfzrdvnPSB7tty78Oz5d6HL+U08G5vzX2n3BHgfZWa/3MDD5ZIfX196urqMmvWLPuVAACwxE646dlyR1hqThu1Zrkj8CHMfHthuSMsFfX19Vll8ICq/Lvbv/5eevvjU9K7T8f53hpm12e7DYdU3D+Tqp8oAQAAgM6gVFp0dBSVmrXqN3MFAAAA+KAUJQAAAAAFRQkAAABAwR4lAAAAUAVKxdFRVGpWEyUAAAAABUUJAAAAQMHSGwAAAKgG1t60CRMlAAAAAAVFCQAAAEBBUQIAAABQsEcJAAAAVIFS8dFRVGpWEyUAAAAABUUJAAAAQMHSGwAAAKgCpdKio6Oo1KwmSgAAAAAKihIAAACAgqIEAAAAoGCPEgAAAKgCpeLoKCo1q4kSAAAAgIKiBAAAAKBg6Q0A0Kk0NDaVO8JS07vWz8A6ojFbr1buCEvNCTc9W+4IS81po9Ysd4Slpl/PLuWOsFTULKjO76sVa2/ahP+aAgAAABQUJQAAAAAFRQkAAABAwR4lAAAAUAVKxUdHUalZTZQAAAAAFBQlAAAAAAVFCQAAAEDBHiUAAABQBUqlRUdHUalZTZ
QAAAAAFBQlAAAAAAVLbwAAAKAKlIqjo6jUrCZKAAAAAAqKEgAAAICCogQAAACgoCgBAACAalDqgMcSuOuuu7LbbrtlxRVXTKlUyjXXXNPq9ebm5px00klZYYUV0rNnz+y444557rnnluxNoigBAAAAOoA5c+Zkww03zFlnnbXY13/4wx/mZz/7Wc4999w88MAD6dWrV0aOHJl58+Yt0ft46g0AAABQ8UaNGpVRo0Yt9rXm5uaceeaZ+c53vpPPfe5zSZJf//rXGTRoUK655prsu+++H/h9TJQAAABAFSh1wI8kqa+vb3U0NjYu8ff+0ksvZfr06dlxxx1brtXV1WXzzTfPfffdt0RfS1ECAAAAlM2QIUNSV1fXcowfP36Jv8b06dOTJIMGDWp1fdCgQS2vfVCW3gAAAABlM2XKlPTt27flvLa2toxpTJQAAAAAZdS3b99Wx4cpSgYPHpwkmTFjRqvrM2bMaHntg+q0RcnJJ5+cjTbaqNwxAIAqdM/dd2WvPT6b4cNWTp8eXXL9ddeUOxKd3IXnn5utN984q64wIKuuMCC7bL9Vbv3zzeWOtcTu/d15uejre+ZHe26cM/cbkd+f8tW88eqL77rv1b89lt9+66D87/9slB/t+fFcevzoLGhcsqdeQEdUKnW8o60MGzYsgwcPzm233dZyrb6+Pg888EBGjBixRF/L0pvCF77whcycOfNdz2EGAFhSc+fOyfrrb5gDDz4ko/f5fLnjQFZcaeWceMr3s9rqa6S5uTm/++2lOXCfPXL7Xx/K2uuuV+54H9jkpx7MJruOzgprrp+mhQtzxyU/yeUnHJrDz7sh3Xssk2RRSfK7E7+UEXt/OTsfcWJqunTJjBf/nlJNp/0ZMVSNhoaGPP/88y3nL730UiZOnJgBAwZk6NChOeaYY/K9730vw4cPz7Bhw3LiiSdmxRVXzO67775E76MoAQBoYzuPHJWdRy7+8YVQDrt8etdW5yecfGouuuC8PPzQAx2qKNn31Atane865vT8dL8Rmf7c0xm6/ieSJLf+cnw2/eyB2XLvw1vuW3bl1do1J7B0PPzww9luu+1azseMGZMkOfjgg3PxxRfnm9/8ZubMmZPDDz88M2fOzFZbbZWbb745PXr0WKL36RC16rbbbpujjz46xxxzTPr3759Bgwbl/PPPz5w5c3LIIYekT58+WWONNXLTTTclSS6++OL069ev1de45pprUnqPuZ6TTz45l1xySa699tqUSqWUSqXccccdS/m7AgCA9rdw4cL88arfZe6cOfnEZluUO85H0jhndpKkR5+6JMmcmW9k6qTHs0y/ZXPJN/bNmftvmUu/eUCmPP1wOWNCuyl1wGNJbLvttmlubn7XcfHFFy/6/kulnHLKKZk+fXrmzZuXW2+9NWuuueYSvksHKUqS5JJLLslyyy2XBx98MEcffXSOOOKI7LXXXtlyyy3z6KOPZuedd86BBx6YuXPnLvHXPu6447L33ntnl112ybRp0zJt2rRsueWWi723sbHxXc94BgCASvfMU09mlUH9suKAXjnumCNzyeW/z1rrrFvuWB9ac1NTbj3v+1l53Y9n4KqL/iI0c/qUJMk9v/1FNhq5V/Y99VcZvMa6uWzsF/LmP14uY1qgI+kwRcmGG26Y73znOxk+fHjGjh2bHj16ZLnllsthhx2W4cOH56STTsobb7yRJ554Yom/du/evdOzZ8/U1tZm8ODBGTx4cLp3777Ye8ePH9/q+c5Dhgz5qN8aAAAsdWusuVZuv/fhTLjjrznkS1/OUYd/MZP+9ky5Y31oN589Lv985bns/q0zWq41NzUlSTYetU823HnPDF593ex0+LczYOVhefzPfyhXVKCD6TBFyQYbbNDy6y5dumTZZZfN+uuv33Jt0KBBSZLXXnttqeYYO3ZsZs2a1XJMmTJlqb4fAAC0he7du2e11dfIRhtvkhPHnZb11t8g553983LH+lAmnH1Knn/wjow+/ZL0Xe
7/HvvZe8DySZLlhq7e6v7lhqye+n9ObdeMQMfVYTZz7datW6vzUqnU6tq/9h9pampKTU1NmpubW92/YMGCNslRW1v7oZ7pDAAAlaSpqSnz5zeWO8YSaW5uzp/POTWT7rslB5x+afoNbj3dXTdo5fRedmDeePWlVtff/MfLWX3TrdszKpTHh9n4o5wqNGuHKUqWxPLLL5/Zs2dnzpw56dWrV5Jk4sSJ7/s53bt3z8KFC9shHQBQ7RoaGvLiC//3+MJXXn45Tzw+Mf37D8iQoUPLmIzO6tTvnpAddtolKw8ZkobZs/OHq67IX+++M1dde2O5oy2RCWePy9N3/CmfP+nsdO/ZKw1v/jNJUturT7rV9kipVMoWex6au3/z8wxcbe0MWm2dPHnr1Xnj1Rezxwk/K3N6oKOoyqJk8803zzLLLJNvf/vb+drXvpYHHnigZRfc97LqqqtmwoQJmTRpUpZddtnU1dW9a4oFAOCDeOyRh/PpkTu0nI/95jeSJPsfcFDO+9VF5YpFJ/b6P1/LkYcfkhnTp6Vv37qs+7H1c9W1N2bb7Xcsd7Ql8ugNlydJfvv/Dmx1fddjx2eDnfZIkmy2+xfyzvz5ufWX4zNv9qwMXG3t7Hfahem/gpIS+GCqsigZMGBAfvOb3+T444/P+eefnx122CEnn3xyDj/88Pf8nMMOOyx33HFHNt100zQ0NOT222/Ptttu236hAYCq8altts3seSZVqRw/Pfv8ckdoE9++cdIHum/LvQ/Plnu/95/9Ad5Pqfk/N/NgidTX16euri6zZs1K3759yx0HAPgvGhqbyh1hqeld22H26effvDHnnXJHWGp+cteL5Y6w1Jw2as1yR2AJVfPf3f71vT00aVp69+k431vD7Pp8Yq0VKu6fif+aAgAAABQUJQAAAACFqtyjBAAAADqbUmnR0VFUalYTJQAAAAAFRQkAAABAQVECAAAAULBHCQAAAFSBUnF0FJWa1UQJAAAAQEFRAgAAAFCw9AYAAACqgbU3bcJECQAAAEBBUQIAAABQUJQAAAAAFOxRAgAAAFWgVHx0FJWa1UQJAAAAQEFRAgAAAFBQlAAAAAAU7FECAAAA1aCUlCpz24/Fq9CsJkoAAAAACooSAAAAgIKlNwAAAFAFSqnY1SyLValZFSVAh/LGnHfKHWGpWbaX/0uG9tC71kAtlaWa////tFFrljvCUrP/pRPLHWGpuezAjcodAcrKnxQAAAAACooSAAAAgEL1zvkBAABAZ2KTkjZhogQAAACgoCgBAAAAKFh6AwAAAFWgVHx0FJWa1UQJAAAAQEFRAgAAAFBQlAAAAAAU7FECAAAAVaBUWnR0FJWa1UQJAAAAQEFRAgAAAFCw9AYAAACqQKk4OopKzWqiBAAAAKCgKAEAAAAoKEoAAAAACvYoAQAAgGpgk5I2YaIEAAAAoKAoAQAAACgoSgAAAAAK9igBAACAKlAqPjqKSs1adUXJqquummOOOSbHHHNMuaMAHdCF55+bi391XiZPfiVJsvY66+a4b30nO+68S5mTAQDVbMc1l82Oay6X5Xp1T5L8Y9a8/PGJ6Xl86uwkyfbDl82Wq/bPqgN6ZpnuXfKlK57M3AULyxkZqlbVFSUAH8WKK62cE0/5flZbfY00Nzfnd7+9NAfus0du/+tDWXvd9codDwCoUm/OXZArHp2a6bMbk5Sy9er9841th2XsDc/mH7PmpXuXmjw+tT6PT63Pfh9fsdxxoaopSgD+zS6f3rXV+Qknn5qLLjgvDz/0gKIEAFhqHn21vtX5lROnZ8c1l8vw5ZfJP2bNy81//2eSZJ1BvcsRjw6ilKRUmatZFqtSo3a4zVy33XbbHHXUUTnqqKNSV1eX5ZZbLieeeGKam5vfde/LL7+cUqmUiRMntlybOXNmSqVS7rjjjiTJW2+9ldGjR2f55ZdPz549M3z48Fx00UXt9N0AlWzhwoX541W/y9w5c/KJzbYodxwAoJMolZIRq/ZLbd
eaPPfPOeWOA51Oh5woueSSS3LooYfmwQcfzMMPP5zDDz88Q4cOzWGHHbbEX+vEE0/MM888k5tuuinLLbdcnn/++bz99tvveX9jY2MaGxtbzuvr69/zXqBjeuapJzNqh09l3rx56dW7dy65/PdZa511yx0LAKhyQ/r1yLhdhqdbl5rMe6cpZ9zxUv4xq/G/fyLQpjpkUTJkyJCcccYZKZVKWWuttfLkk0/mjDPO+FBFyeTJk7Pxxhtn0003TbJoM9j3M378+IwbN+7DxAY6iDXWXCu33/tw6utn5fpr/pijDv9irrv5NmUJALBUTa1vzNgbJmWZbl2y2Sr98pVPrpJT//ycsgTaWYdbepMkW2yxRUr/tvBqxIgRee6557Jw4ZLv+nzEEUfkiiuuyEYbbZRvfvObuffee9/3/rFjx2bWrFktx5QpU5b4PYHK1r1796y2+hrZaONNcuK407Le+hvkvLN/Xu5YAECVW9jUnBmz5+elN9/O7x6blslvvZ1d1l6+3LHoQEod8KhEHbIo+aBqahZ9e/++f8mCBQta3TNq1Ki88sorOfbYYzN16tTssMMOOe64497za9bW1qZv376tDqC6NTU1Zf58P8kBANpXqZR07VLVf2WDitQh/6174IEHWp3ff//9GT58eLp06dLq+vLLL2pfp02b1nLt3zd2/ff7Dj744PzmN7/JmWeemV/+8pdtHxroEE797gm59567M/mVl/PMU0/m1O+ekL/efWc+v8/+5Y4GAFSxfTZeIWsP7JXlenXPkH49ss/GK2SdQb3z15feTJLU9eiaVfr3zKA+3ZMkQ/r3yCr9e6ZX9y7v92WBD6FD7lEyefLkjBkzJl/+8pfz6KOP5uc//3l+/OMfv+u+nj17Zosttsjpp5+eYcOG5bXXXst3vvOdVvecdNJJ2WSTTbLeeuulsbExf/rTn7LOOuu017cCVJjX//lajjz8kMyYPi19+9Zl3Y+tn6uuvTHbbr9juaMBAFWsb4+uOeKTq6Rfz66Zu2Bhprw1L6ff9kKemtaQJNlxzeWy54aDW+7/7sjhSZJz/zo5d734ZlkyU3lKpQ72eOAKzdohi5KDDjoob7/9djbbbLN06dIlX//613P44Ycv9t4LL7wwhx56aDbZZJOstdZa+eEPf5idd9655fXu3btn7Nixefnll9OzZ8986lOfyhVXXNFe3wpQYX569vnljgAAdELn3/f+ex/+4Ynp+cMT09spDXRuHbIo6datW84888ycc84573rt5ZdfbnW+zjrrvGuD1n/fs+Q73/nOu6ZMAAAAgM6pQ+5RAgAAALA0dMiJEgAAAOA/VfJDdxenMrN2uKLkjjvuKHcEAAAAoEpZegMAAABQ6HATJQAAAMC7eTxw2zBRAgAAAFBQlAAAAAAUFCUAAAAABXuUAAAAQBXwcOC2YaIEAAAAoKAoAQAAACgoSgAAAAAK9igBAACAKlAqLTo6ikrNaqIEAAAAoKAoAQAAAChYegMAAABVoFR8dBSVmtVECQAAAEBBUQIAAABQUJQAAAAAFOxRAgAAANWgVBwdRYVmNVECAAAAUDBRAnQoy/byf1sAVJeZby8sd4Slpl/PLuWOsNRcduBG5Y4ALCX+xgEAAABVwMqbtmHpDQAAAEBBUQIAAABQUJQAAAAAFOxRAgAAAFWgVFp0dBSVmtVECQAAAEBBUQIAAABQsPQGAAAAqkCp+OgoKjWriRIAAACAgqIEAAAAoKAoAQAAACjYowQAAACqQak4OooKzWqiBAAAAKCgKAEAAAAoKEoAAAAACvYoAQAAgCpgi5K2YaIEAAAAoKAoAQAAAChYegMAAABVoFRadHQUlZrVRAkAAABAQVECAAAAUFCUAABAlfvJ/56e7bfaIkMG9svwVVbI6L33yHPPTip3LICK1CmKki984QsplUrvOp5//vlWr3Xv3j1rrLFGTjnllLzzzjvljg0AAG3i3rvvype+fET+fMdf88frb86CBQuyx2
6jMmfOnHJHA9pUqUN9VOoDgjvNZq677LJLLrroolbXll9++VavNTY25sYbb8yRRx6Zbt26ZezYseWICgAAber3193Y6vzsX16Y4auskImPPZJPbrV1mVIBVKZOMVGSJLW1tRk8eHCro0uXLq1eW2WVVXLEEUdkxx13zHXXXVfmxAAAsHTU189KkvTvP6DMSQAqT6eZKFkSPXv2zBtvvLHY1xobG9PY2NhyXl9f316xAADgI2tqasrY48dk8xFbZt31PlbuOEAb8njgttFpJkr+9Kc/pXfv3i3HXnvt9a57mpubc+utt2bChAnZfvvtF/t1xo8fn7q6upZjyJAhSzs6AAC0meOOOTp/e+bpXHDJZeWOAlCROs1EyXbbbZdzzjmn5bxXr14tv/5XibJgwYI0NTVl//33z8knn7zYrzN27NiMGTOm5by+vl5ZAgBAh3D8sV/LhJtuyI233J6VVl653HEAKlKnKUp69eqVNdZYY7Gv/atE6d69e1ZcccV07frevy21tbWpra1dWjEBAKDNNTc355tjvp4brrsm10+4LausOqzckQAqVqcpSt7P+5UoAADQ0R13zNH5/ZWX57Ir/5jevftkxvTpSZK+dXXp2bNnmdMBVBZFCQAAVLkLzz83SbLryB1aXT/rvAuy/4EHlyMSQMVSlAAAQJV7a+475Y4A0GF0iqLk4osv/lCvAQAAQEfh8cBto9M8HhgAAADgv1GUAAAAABQUJQAAAACFTrFHCQAAAFS7UvHRUVRqVhMlAAAAAAVFCQAAAEBBUQIAAABQsEcJAADw/9u786go6/0P4O9nAGHYxViE2NzRq2IqSHpRSFM7xxRzuWUCuR0VITE07F71mhvp1bpoFy03NDG8Vpa5oGmueS0RNBdwQYUSETUWBWKZ7+8PYX6NqCkz8PgM75dnzvF5vg8z7++wDHzmuxCREZCk+zeleFazckQJEREREREREVE1FkqIiIiIiIiIiKpx6g0RERERERGREZCqb0rxrGbliBIiIiIiIiIiomoslBARERERERERVWOhhIiIiIiIiIioGtcoISIiIiIiIjIGXKTEIDiihIiIiIiIiIioGgslRERERERERETVOPWGiIiIiIiIyAhI1f+U4lnNyhElRERERERERETVWCghIiIiIiIiIqrGqTd6EkIAAIqKimROQkRERERKVFRaJXeEeqOqMJE7ApFWzd9sNX/DET0KCyV6Ki4uBgC4u7vLnISIiIiIiIj+THFxMezs7OSOUS8k6f5NKZ7VrCyU6MnV1RU5OTmwsbGB1ACf5aKiIri7uyMnJwe2trb1/ngNiX1TJmPtm7H2C2DflIp9Uyb2TZnYN2Vi35SpIfsmhEBxcTFcXV3r9XFI+Vgo0ZNKpcLzzz/f4I9ra2trdD8ka7BvymSsfTPWfgHsm1Kxb8rEvikT+6ZM7JsyNVTfjHUkCRkWCyVERERERERERkCqvinFs5qVu94QEREREREREVVjoURhzM3NMWfOHJibm8sdxeDYN2Uy1r4Za78A9k2p2DdlYt+UiX1TJvZNmYy5b6RckuDeSERERERERESKVVRUBDs7O+TmFyhqHZuioiI0d7RHYWHhM5Wba5QQERERERERGQMuUmIQnHpDRERERERERFSNhRIiIiIiIiIiomoslBARERERERERVeMaJUTUqAkhIEnP6ORIIiIyqLKyMlhYWMgdg4io3kjV/5TiWc3KESXPgFu3bmHx4sUICQlBQEAAAgICEBISgiVLliA/P1/ueERGzdzcHOfPn5c7BhEpSG5uLmbPno3g4GD4+PigQ4cOGDRoENasWYOqqiq54+nts88+w7179+SOUS+cnJwQHh6OvXv3QqPRyB2HntChQ4dQWVlZ63xlZSUOHTokQyIiMnYslMjsp59+Qps2bRAfHw87OzsEBgYiMDAQdnZ2iI+PR7t27XDixAm5Y9aLnJwcjBkzRu4YdVZaWoojR47g3LlztdrKysqwYcMGGV
IZxvnz57Fu3TpkZGQAADIyMjBp0iSMGTMG+/fvlzld3UybNu2ht6qqKsTFxWmPjcG9e/ewbt06/P3vf8eKFStw+/ZtuSPV2cmTJ3HlyhXt8caNG9GzZ0+4u7ujV69e+Pzzz2VMp5/IyEgcPnxY7hj1ZsWKFQgNDdV+jjZu3Ij27dujXbt2eO+99x76R48SnDhxAj4+Pti5cycqKipw8eJFdO3aFVZWVoiJiUFgYCCKi4vljqmX6OhoODs744033sDOnTuNovhTIzExEffu3cPgwYPh5uaGqVOnGu3vWUVFRdi2bZtRvBkQFBSEO3fu1DpfWFiIoKAgGRIZTmJiInbs2KE9njFjBuzt7fHiiy/i2rVrMiYjauQEycrf319MmDBBaDSaWm0ajUZMmDBB9OjRQ4Zk9S89PV2oVCq5Y9RJZmam8PT0FJIkCZVKJQIDA8X169e17Tdu3FBs33bt2iWaNGkiHBwchIWFhdi1a5dwdHQUffv2FcHBwcLExETs27dP7phPTZIk4evrK/r06aNzkyRJdO/eXfTp00cEBQXJHbNOfHx8xO3bt4UQQmRnZwsvLy9hZ2cnunfvLhwcHISTk5PIysqSOWXddOrUSezdu1cIIcSnn34q1Gq1iIqKEgkJCWLq1KnC2tparFmzRuaUdVPz86N169YiLi5O5Obmyh3JYObNmydsbGzEa6+9JlxcXERcXJxo1qyZmD9/vli4cKFwdHQUs2fPljtmnfTs2VP885//1B5v3LhR+Pv7CyGEuHPnjvD19RVRUVFyxTOIiooKsX37dvHGG28IKysr4ejoKCZPniyOHj0qdzSDKSoqEmvXrhX9+vUTJiYmonXr1mLu3Llyx9LL8OHDxfLly4UQQpSUlIjWrVsLMzMzYWpqKrZu3SpzOv1IkiRu3rxZ63xmZqawsbGRIZHhtGnTRvt71Q8//CAsLS3FqlWrxKBBg0RISIjM6eqm5vXtcTcTExO5YxqdwsJCAUDk3S4UpRVCMbe82/dzFxYWPlV/V6xYITw9PYW5ubnw8/MTx48fN+jzKQkhhNzFmsZMrVYjLS0N7dq1e2h7RkYGunTpgtLS0gZOpr9vvvnmse1ZWVl45513FPlOVUhICCoqKrB+/XoUFBRg6tSpOHfuHA4cOAAPDw/k5eXB1dVVkX178cUXERwcjPnz5+Pzzz/H5MmTMWnSJCxYsAAAMHPmTKSmpmLPnj0yJ306cXFx+OSTT7B69WoEBwdrz5uZmeHUqVNo3769jOn0o1KpcOPGDTg5OeHNN9/ElStXsHPnTtjZ2eHu3bsICQmBo6MjkpKS5I761CwtLXH+/Hl4enrihRdewKRJkzB+/Hhte1JSEhYsWICzZ8/KmLJuVCoV9u7di+3bt2PTpk0oLCzEwIEDMX78eLzyyitQqZQ76LNVq1ZYvHgxhg4dilOnTqFr165ITEzEqFGjAABfffUVZsyYgYsXL8qc9OlZWlrizJkzaNGiBQBAo9HAwsICOTk5cHZ2xt69exEeHo5ff/1V5qSGUVJSgq+++gpJSUn47rvv8Pzzz+Py5ctyxzKoc+fOYdSoUTh9+rQiX7druLi4ICUlBZ07d0ZSUhLmzJmDU6dOITExEZ988gnS0tLkjvjUhg4dCgD4+uuvMWDAAJibm2vbqqqqcPr0abRt2xa7d++WK6LeLC0tkZGRAQ8PD7z77rvIzc3Fhg0bcPbsWfTp00eR0/C//vrrR7YdO3YM8fHx0Gg0KCsra8BUxq+oqAh2dnbIu10IW1tbueM8saKiIjg3s0Nh4ZPnTk5ORmhoKFauXAl/f3989NFH+O9//4vMzEw4OTkZJphByy701Ly8vERiYuIj2xMTE4Wnp2fDBTKgmmqyJEmPvCl11IWTk5M4ffq09lij0YiJEycKDw8PcfnyZUWPKLG1tRUXL14UQghRVVUlTE1NxcmTJ7XtP//8s3B2dpYrnl5+/PFH0aZNG/HOO++I8vJyIYQQpqam4uzZsz
In048kSSIvL08IIUSLFi3Enj17dNqPHj0q3N3d5Yimt2bNmokTJ04IIe5/36Wnp+u0X7p0SajVajmi6e2Pn7fy8nKRnJws+vfvL0xMTISrq6t47733tN+LSqNWq8W1a9e0x2ZmZuLMmTPa46tXrwpLS0s5ounN09NTHDlyRHt8/fp1IUmSKCkpEUIIceXKFWFhYSFXvHqRn58vli9fLjp06KDY17YHlZaWiuTkZDF48GBhbm4uPDw8xLvvvit3LL1YWFiI7OxsIYQQo0eP1vbn2rVrwsrKSs5odRYeHi7Cw8OFJEli5MiR2uPw8HAxYcIEsXDhQpGfny93TL04Ojpqf8/y9fUVGzZsEELcf31T6uftYTIyMsSQIUOEiYmJCA0NFVevXpU7ktFpTCNK/Pz8REREhPa4qqpKuLq6ikWLFhns+eSuNzKLiYnBhAkTkJqaipdeegnOzs4AgLy8POzbtw+ffvop/vWvf8mcsm6aN2+O//znPxg8ePBD29PT09G1a9cGTmUYpaWlMDX9/28fSZKQkJCAKVOmoHfv3op85/6PanaBUalUsLCwgJ2dnbbNxsYGhYWFckXTS/fu3ZGamoqIiAh069YNmzZtMpodb2r6UVZWhubNm+u0ubm5KfIdKQAYOHAgEhISsHr1avTu3Rtbt25F586dte1btmxBq1atZExoGGZmZhgxYgRGjBiB7OxsrF27FuvXr0dcXJwi3+F2cXHBuXPn4OHhgYsXL6Kqqgrnzp1Dhw4dAABnz5413Ds+DWzIkCGYOHEilixZAnNzc8ybNw+9e/eGWq0GAGRmZsLNzU3mlPqrGUmyadMm7Nu3D+7u7nj99dexdetWuaPpJSUlBUlJSdi2bRtMTU0xbNgw7NmzB4GBgXJH05u7uzuOHTsGBwcH7N69W7s+0G+//abYnX7WrVsHAPDy8kJMTAysrKxkTmR4/fr1w7hx49ClSxdcuHABr7zyCoD7Pye9vLzkDWcA169fx5w5c5CYmIj+/fsjPT0df/nLX+SOZdSKiorkjvBUavI+mNvc3FxnFFmN8vJypKamYubMmdpzKpUKffv2xbFjxwwXzGAlF6qzzz//XPj7+wtTU1PtSAtTU1Ph7+8vkpOT5Y5XZ4MGDRKzZs16ZHt6erqQJKkBExlO9+7dtRX/B0VERAh7e3vFvuvWqVMnsWvXLu3xzz//LCoqKrTHhw4dEt7e3nJEM6jNmzcLZ2dnoVKpjGJESceOHUWXLl2EtbV1rbnoBw8eFG5ubjKl08+vv/4qvLy8RGBgoJg2bZpQq9WiV69eYvz48SIwMFA0adJE7NixQ+6YdfLHESUPo9Foao0OUop//OMfwtHRUYwbN054e3uL2NhY4eHhIRISEsTKlSuFu7u7iI6OljtmnRQXF4sRI0ZoX7NffPFFnTWAUlJSxJYtW2RMqL+RI0dq1yaJiIgQP/zwg9yRDEatVovhw4eLbdu2aUcWGouPP/5YmJqaCnt7e9G5c2dRVVUlhBAiPj5e9OnTR+Z09Ci//fabiIiIEK+++qrO71+zZ88W8+fPlzGZfgoKCsSMGTOEWq0WAQEB4tChQ3JHMnqlpaXCxcVFAFDczdrauta5OXPmPLSfv/76qwBQ67Vp+vTpws/Pz2DPJ0eUPANGjhyJkSNHoqKiArdu3QIAPPfcczAzM5M5mX6mT5/+2O0FW7Vqhe+//74BExlOSEgINm/ejNGjR9dqW7FiBTQaDVauXClDMv1NmjRJ5x3sB6v+u3bt0lnjQ6n+9re/oVevXkhNTYWnp6fccfQyZ84cnWNra2ud4+3bt+Ovf/1rQ0YyGFdXV6SlpSEuLg7bt2+HEAI//vgjcnJy0LNnTxw9ehTdunWTO2adeHp6wsTE5JHtkiShX79+DZjIcObOnQu1Wo1jx45h/PjxiI2NRefOnTFjxgyUlJRg0KBBmDdvntwx68Ta2hrJyckoKytDZWVlre+3l19+WaZkhmNiYoItW7agf//+j/
0aVaK8vDzY2NjIHaNeTJ48GX5+fsjJyUG/fv206xy1aNEC8+fPlznd0+vSpcsTj/o8efJkPaepP/b29lixYkWt83PnzpUhjWEsXrwYH3zwAVxcXLB58+ZHji4nw7KwsMCVK1dQXl4ud5SnJoSo9f3+sNEkDYmLuRIRERFRo/A0Q9KVtBiiMXqaQsGDbxgoyaFDhx7brsRpYSqVCmq1Gn379n1ssfXLL79swFRkLMrLy2FpaYmtW7diyJAh2vNhYWEoKCh47GLCT4OFEiIiIqJGLj4+HhMmTICFhQXi4+Mfe21UVFQDpTI8lUr1p6MUat7ZVNr6QGPGjHls+9q1axsoCT2Nh+1w9sevUaV9HQJAeHj4E40GqlmDhuhp+fv7w8/PD8uXLwdwfwc6Dw8PTJkyBbGxsQZ5DBZKiIiIiBo5b29vnDhxAs2aNYO3t/cjr5MkCVlZWQ2YzLASExMRGxuL8PBwBAQEALi/XWliYiIWLVqks3hm7969ZUpZNyEhITrHFRUVOHPmDAoKChAcHMx3759RDy6QX1FRgbS0NMyaNQsLFizASy+9JFOyusvKyoKXl5eit7mnZ1tycjLCwsKwatUq+Pn54aOPPsKWLVuQkZGh3RxFXyyUEBEREVGj8NJLL2HcuHF4/fXXdc4nJSXhk08+wYEDB+QJVk80Gg0mTZqEli1bYsaMGXLHqbM/GwmkxFEXf+bgwYOYNm0aUlNT5Y7y1ExMTJCbm6vd3WzkyJGIj4832B+wRMD9dSGXLFmCGzduwNfXF/Hx8fD39zfY/bNQQkRERNTITZs27YmukyQJS5curec09cfS0hKnTp1C69atdc5fuHABvr6+KCkpkSlZ/cnMzESfPn2Qm5srd5Q6e3DNgZpRF4mJiZg7dy7Gjh0rU7L6k5GRgW7duuHu3btyR3lqKpUKN27c0BZKbGxscOrUKbRo0ULmZERPjrveEBERETVyaWlpOscnT55EZWUl2rZtC+B+IcHExARdu3aVI57BuLu749NPP8XixYt1zq9evRru7u4ypapfly9fRmVlpdwx9PKwXVOGDRuGDh06IDk5WdGFktOnT+scCyGQm5uLuLg4+Pr6yhOKiFgoISIiImrsvv/+e+3/ly1bBhsbGyQmJqJp06YAgN9++w1vvfWWYrcar/Hhhx/itddew65du7RDtH/88UdcuHBB8Wt4PDgqqOYP7h07diAsLEymVPWrR48emDBhgtwx9OLr6wtJkvDgIP8ePXoodgFeSZJqTZV60q2eiZ4VnHpDRERERFpubm7Ys2cPOnTooHP+zJkzePnll3H9+nWZkhnGL7/8goSEBJw/fx4A4OPjg4kTJyp+RElQUJDOsUqlgqOjI4KDgzFmzBiYmhrX+6OlpaWYOXMmdu3ahczMTLnj1Nm1a9d0jms+bxYWFjIl0p9KpcLAgQNhbm4OANi+fTuCg4NhZWWlc53Si5Nk3IzrJyYRERER6aWoqAj5+fm1zufn56O4uFiGRIZ15coVXL16Fbm5udi6dSvc3NywceNGeHt7o1evXnLHq7M/jgoyNk2bNtUZkSCEQHFxMdRqNTZt2iRjMv15enrKHcHgHhzB9Oabb8qUhKjuWCghIiIiIq2QkBC89dZbWLp0Kfz8/AAAx48fx/Tp0zF06FCZ0+nniy++wOjRozFq1CikpaXh999/B3B/i9aFCxdi586dMiekh/nwww91CiU1oy78/f2108OUJD4+/omvjYqKqsck9WPdunVyRyDSG6feEBEREZFWSUkJYmJisHbtWlRUVAAATE1NMXbsWCxZsqTW8Hkl6dKlC6KjoxEaGqqzE0daWhoGDhyIGzduyB2xzvLy8hATE4N9+/bh5s2btda8UPoWumVlZTh9+jRu3rwJjUaj0/bqq6/KlKpuvL29n+g6SZKQlZVVz2mI6GFYKCEiIiKiWu7du4fLly8DAFq2bKnoAkkNS0tLnDt3Dl5eXjqFkqysLLRv3x5lZWVyR6yzgQ
MHIjs7G1OmTEHz5s1rLZ75sJ1jlGL37t0IDQ3F7du3axWAJElSfBGoRk3fuPApkfw49YaIiIiIarGyskKnTp3kjmFQLi4uuHTpEry8vHTOHzlyBC1atJAnlIEcOXIEhw8fNsotZSMjIzF8+HDMnj0bzs7OcscxuDVr1uDDDz/ExYsXAQCtW7fG1KlTMW7cOJmTETVeLJQQERERUaMwfvx4vP3221i7di0kScL169dx7NgxxMTEYNasWXLH04u7u3ut0RbGIi8vD9OmTTPKIsns2bOxbNkyREZGIiAgAABw7NgxREdHIzs7G++//77MCYkaJ069ISIiIqJGQQiBhQsXYtGiRSgpKQEAmJubIyYmBvPmzZM5nX727NmDpUuXYtWqVbVGzCjdmDFj0LNnT4wdO1buKAbn6OiI+Ph4vP766zrnN2/ejMjISNy6dUumZESNGwslRERERNSolJeX49KlS7h79y7at28Pa2truSPprWnTpigpKUFlZSUsLS1hZmam037nzh2ZkumvpKQEw4cPh6OjIzp27Firb0rcGaaGvb09fvrpJ7Ru3Vrn/IULF+Dn54eCggJ5ghE1ciyUEBEREREpXGJi4mPbw8LCGiiJ4a1ZswYTJ06EhYUFmjVrprPYqdJ3homMjISZmRmWLVumcz4mJgalpaX4+OOPZUpG1LixUEJERERERM8sFxcXREVFITY2FiqVSu44BhUZGYkNGzbA3d0dPXr0AAAcP34c2dnZCA0N1Rk982AxhYjqDwslREREREQKVFRUBFtbW+3/H6fmOiVycHDATz/9hJYtW8odxeCCgoKe6DpJkrB///56TkNENVgoISIiIiJSIBMTE+Tm5sLJyQkqlUpnSkoNIQQkSUJVVZUMCQ0jOjoajo6OeO+99+SOQkSNBLcHJiIiIiJSoP3798PBwQEA8P3338ucpv5UVVVh8eLFSElJQadOnWot5sopKURkaBxRQkRERERkBMrKynD69GncvHkTGo1Gp+3VV1+VKZX+Hjc9hVNSiKg+sFBCRERERKRwu3fvRmhoKG7dulWrTelTb4iIGppxLRtNRERERNQIRUZGYvjw4cjNzYVGo9G5sUhCRPR0OKKEiIiIiEjhbG1tkZaWZpQ7wxARNTSOKCEiIiIiUrhhw4bhwIEDcscgIjIKHFFCRERERKRwJSUlGD58OBwdHdGxY8daO8NERUXJlIyISHlYKCEiIiIiUrg1a9Zg4sSJsLCwQLNmzSBJkrZNkiRkZWXJmI6ISFlYKCEiIiIiUjgXFxdERUUhNjYWKhVn1xMR6YM/RYmIiIiIFK68vBwjR45kkYSIyAD4k5SIiIiISOHCwsKQnJwsdwwiIqNgKncAIiIiIiLST1VVFRYvXoyUlBR06tSp1mKuy5YtkykZEZHycI0SIiIiIiKFCwoKemSbJEnYv39/A6YhIlI2FkqIiIiIiIiIiKpxjRIiIiIiIiIiomoslBARERERERERVWOhhIiIiIiIiIioGgslREREjVx4eDiGDBmiPe7Tpw+mTp3a4DkOHDgASZJQUFDQ4I9NREREVIOFEiIiomdUeHg4JEmCJElo0qQJWrVqhffffx+VlZX1+rhffvkl5s2b90TXsrhBRERExsZU7gBERET0aAMGDMC6devw+++/Y+fOnYiIiICZmRlmzpypc115eTmaNGlikMd0cHAwyP0QERERKRFHlBARET3DzM3N4eLiAk9PT0yaNAl9+/bFN998o50us2DBAri6uqJt27YAgJycHIwYMQL29vZwcHDA4MGDcfXqVe39VVVVYdq0abC3t0ezZs0wY8YMCCF0HvPBqTe///473n33Xbi7u8Pc3BytWrXCmjVrcPXqVQQFBQEAmjZtCkmSEB4eDgDQaDRYtGgRvL29oVar0blzZ2zdulXncXbu3Ik2bdpArVYjKChIJycRERGRXFgoISIiUhC1Wo3y8nIAwL59+5CZmYm9e/fi22+/RUVFBfr37w8bGx
scPnwYR48ehbW1NQYMGKD9mKVLl2L9+vVYu3Ytjhw5gjt37uCrr7567GOGhoZi8+bNiI+Px/nz57Fq1SpYW1vD3d0dX3zxBQAgMzMTubm5+Pe//w0AWLRoETZs2ICVK1fi7NmziI6OxptvvomDBw8CuF/QGTp0KAYNGoT09HSMGzcOsbGx9fW0ERERET0xTr0hIiJSACEE9u3bh5SUFERGRiI/Px9WVlZYvXq1dsrNZ599Bo1Gg9WrV0OSJADAunXrYG9vjwMHDuDll1/GRx99hJkzZ2Lo0KEAgJUrVyIlJeWRj3vhwgVs2bIFe/fuRd++fQEALVq00LbXTNNxcnKCvb09gPsjUBYuXIjvvvsOAQEB2o85cuQIVq1ahd69eyMhIQEtW7bE0qVLAQBt27bFzz//jA8++MCAzxoRERHR02OhhIiI6Bn27bffwtraGhUVFdBoNHjjjTfwz3/+ExEREejYsaPOuiSnTp3CpUuXYGNjo3MfZWVluHz5MgoLC5Gbmwt/f39tm6mpKbp161Zr+k2N9PR0mJiYoHfv3k+c+dKlSygpKUG/fv10zpeXl6NLly4AgPPnz+vkAKAtqhARERHJiYUSIiKiZ1hQUBASEhLQpEkTuLq6wtT0/1+6raysdK69e/cuunbtik2bNtW6H0dHxzo9vlqtfuqPuXv3LgBgx44dcHNz02kzNzevUw4iIiKihsJCCRER0TPMysoKrVq1eqJrX3jhBSQnJ8PJyQm2trYPvaZ58+Y4fvw4AgMDAQCVlZVITU3FCy+88NDrO3bsCI1Gg4MHD2qn3vxRzYiWqqoq7bn27dvD3Nwc2dnZjxyJ4uPjg2+++Ubn3P/+978/7yQRERFRPeNirkREREZi1KhReO655zB48GAcPnwYV65cwYEDBxAVFYVffvkFAPD2228jLi4O27ZtQ0ZGBiZPnoyCgoJH3qeXlxfCwsIwZswYbNu2TXufW7ZsAQB4enpCkiR8++23yM/Px927d2FjY4OYmBhER0cjMTERly9fxsmTJ7F8+XIkJiYCACZOnIiLFy9i+vTpyMzMRFJSEtavX1/fTxERERHRn2KhhIiIyEhYWlri0KFD8PDwwNChQ+Hj44OxY8eirKxMO8LknXfewejRoxEWFoaAgADY2NggJCTksfebkJCAYcOGYfLkyWjXrh3Gjx+Pe/fuAQDc3Nwwd+5cxMbGwtnZGVOmTAEAzJs3D7NmzcKiRYvg4+ODAQMGYMeOHfD29gYAeHh44IsvvsC2bdvQuXNnrFy5EgsXLqzHZ4eIiIjoyUjiUau3ERERERERERE1MhxRQkRERERERERUjYUSIiIiIiIiIqJqLJQQEREREREREVVjoYSIiIiIiIiIqBoLJURERERERERE1VgoISIiIiIiIiKqxkIJEREREREREVE1FkqIiIiIiIiIiKqxUEJEREREREREVI2FEiIiIiIiIiKiaiyUEBERERERERFVY6GEiIiIiIiIiKja/wEj8HryF3eV6wAAAABJRU5ErkJggg==\n"
          },
          "metadata": {}
        }
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "## Save fine-tuned model locally"
      ],
      "metadata": {
        "id": "Hr1gTKP8trRb"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "import os\n",
        "\n",
        "# Directory and filename for the fine-tuned PaliGemma checkpoint\n",
        "TARGET_MODEL_DIR = f\"{dataset.location}/model\"\n",
        "TARGET_MODEL_PATH = f\"{TARGET_MODEL_DIR}/paligemma-3b-pt-224.f16.npz\"\n",
        "\n",
        "os.makedirs(TARGET_MODEL_DIR, exist_ok=True)"
      ],
      "metadata": {
        "id": "N4Y43q4jKj7a"
      },
      "execution_count": null,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "# Flatten the nested params pytree into (name, array) pairs and save\n",
        "# them as a single .npz checkpoint\n",
        "flat, _ = big_vision.utils.tree_flatten_with_names(params)\n",
        "with open(TARGET_MODEL_PATH, \"wb\") as f:\n",
        "  np.savez(f, **{k: v for k, v in flat})"
      ],
      "metadata": {
        "id": "zyVxKr2FOxPe"
      },
      "execution_count": null,
      "outputs": []
    },
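    {
      "cell_type": "markdown",
      "source": [
        "As an optional sanity check, the saved `.npz` checkpoint can be reloaded with NumPy to confirm it was written correctly. The sketch below assumes `TARGET_MODEL_PATH` still points at the file written above."
      ],
      "metadata": {}
    },
    {
      "cell_type": "code",
      "source": [
        "# Optional: reload the checkpoint and inspect a few of the saved arrays\n",
        "ckpt = np.load(TARGET_MODEL_PATH)\n",
        "print(f\"{len(ckpt.files)} arrays saved\")\n",
        "for name in ckpt.files[:5]:\n",
        "  print(name, ckpt[name].shape, ckpt[name].dtype)"
      ],
      "metadata": {},
      "execution_count": null,
      "outputs": []
    },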
    {
      "cell_type": "markdown",
      "source": [
        "## Deploy model on Roboflow"
      ],
      "metadata": {
        "id": "7gj7L3BkMZrZ"
      }
    },
    {
      "cell_type": "code",
      "source": [
        "version.deploy(model_type=\"paligemma-3b-pt-224\", model_path=TARGET_MODEL_DIR)"
      ],
      "metadata": {
        "id": "_YfNYT7qMMY2",
        "outputId": "2cbcf204-22d0-45d3-fe29-87cdd663278e",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": null,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Model files found in /content/number-ops-1/model: ['paligemma-3b-pt-224.f16.npz']\n",
            "Found .npz file paligemma-3b-pt-224.f16.npz in model path. Deploying JAX PaliGemma model.\n",
            "Zipping files for deploy: ['paligemma-3b-pt-224.f16.npz']\n",
            "Uploading to Roboflow... May take several minutes.\n",
            "View the status of your deployment at: https://app.roboflow.com/roboflow-jvuqo/number-ops-j1426/1\n",
            "Share your model with the world at: https://universe.roboflow.com/roboflow-jvuqo/number-ops-j1426/model/1\n"
          ]
        }
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "# Congratulations\n",
        "\n",
        "⭐️ If you enjoyed this notebook, [**star the Roboflow Notebooks repo**](https://github.com/roboflow/notebooks) (and [**supervision**](https://github.com/roboflow/supervision) while you're at it) and let us know what tutorials you'd like to see us do next. ⭐️"
      ],
      "metadata": {
        "id": "kR8llI4Qv0pR"
      }
    }
  ],
  "metadata": {
    "accelerator": "GPU",
    "colab": {
      "gpuType": "T4",
      "provenance": [],
      "machine_shape": "hm"
    },
    "kernelspec": {
      "display_name": "Python 3",
      "name": "python3"
    },
    "language_info": {
      "name": "python"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
