{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "TtPtBDkEzTmt"
      },
      "source": [
        "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/pinecone-io/examples/blob/master/learn/search/multilingual/tatoeba-semantic-search-multilingual/tatoeba-semantic-search-multilingual.ipynb)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Q3Yn2P_FrqkO"
      },
      "source": [
        "## Semantic Search for Better Language Learning with the Inference API and Pinecone\n",
        "\n",
        "\n",
        "Multilingual embeddings make it possible to extract information across languages, without translation.\n",
        "\n",
        "In this notebook, we'll introduce you to the [Pinecone Inference API](https://docs.pinecone.io/guides/inference/understanding-inference-api) and the multilingual E5 embedding model by building a simple language-learning search application. By the end of this notebook, you'll know how to:\n",
        "\n",
        "- use the Inference API to embed new data\n",
        "- use the E5 model on multilingual data\n",
        "- upsert and query multilingual data with Pinecone\n",
        "\n",
        "\n",
        "Let's get started!"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "LezFT_kfKpnK"
      },
      "source": [
        "**What we'll do in this Notebook:**\n",
        "\n",
        "1.   Installation\n",
        "2.   Dataset Prep\n",
        "3.   Using the Inference API for Embedding\n",
        "4.   Indexing into Pinecone\n",
        "5.   Language Learning Semantic Search"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "XtTjkdcarqkP"
      },
      "source": [
        "### Installation\n",
        "\n",
        "We'll get started by installing Pinecone and the Hugging Face datasets library, which will let us access the Tatoeba dataset."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "ZYp08hMJrqkP",
        "outputId": "db5de444-5cb6-477b-f50c-b43081bd321a"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "Collecting pinecone\n",
            "  Downloading pinecone-5.1.0-py3-none-any.whl.metadata (19 kB)\n",
            "Collecting datasets\n",
            "  Downloading datasets-2.21.0-py3-none-any.whl.metadata (21 kB)\n",
            "Requirement already satisfied: certifi>=2019.11.17 in /usr/local/lib/python3.10/dist-packages (from pinecone) (2024.8.30)\n",
            "Collecting pinecone-plugin-inference<2.0.0,>=1.0.3 (from pinecone)\n",
            "  Downloading pinecone_plugin_inference-1.0.3-py3-none-any.whl.metadata (2.2 kB)\n",
            "Collecting pinecone-plugin-interface<0.0.8,>=0.0.7 (from pinecone)\n",
            "  Downloading pinecone_plugin_interface-0.0.7-py3-none-any.whl.metadata (1.2 kB)\n",
            "Requirement already satisfied: tqdm>=4.64.1 in /usr/local/lib/python3.10/dist-packages (from pinecone) (4.66.5)\n",
            "Requirement already satisfied: typing-extensions>=3.7.4 in /usr/local/lib/python3.10/dist-packages (from pinecone) (4.12.2)\n",
            "Requirement already satisfied: urllib3>=1.26.0 in /usr/local/lib/python3.10/dist-packages (from pinecone) (2.0.7)\n",
            "Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from datasets) (3.15.4)\n",
            "Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.10/dist-packages (from datasets) (1.26.4)\n",
            "Collecting pyarrow>=15.0.0 (from datasets)\n",
            "  Downloading pyarrow-17.0.0-cp310-cp310-manylinux_2_28_x86_64.whl.metadata (3.3 kB)\n",
            "Collecting dill<0.3.9,>=0.3.0 (from datasets)\n",
            "  Downloading dill-0.3.8-py3-none-any.whl.metadata (10 kB)\n",
            "Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from datasets) (2.1.4)\n",
            "Requirement already satisfied: requests>=2.32.2 in /usr/local/lib/python3.10/dist-packages (from datasets) (2.32.3)\n",
            "Collecting xxhash (from datasets)\n",
            "  Downloading xxhash-3.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (12 kB)\n",
            "Collecting multiprocess (from datasets)\n",
            "  Downloading multiprocess-0.70.16-py310-none-any.whl.metadata (7.2 kB)\n",
            "Requirement already satisfied: fsspec<=2024.6.1,>=2023.1.0 in /usr/local/lib/python3.10/dist-packages (from fsspec[http]<=2024.6.1,>=2023.1.0->datasets) (2024.6.1)\n",
            "Requirement already satisfied: aiohttp in /usr/local/lib/python3.10/dist-packages (from datasets) (3.10.5)\n",
            "Requirement already satisfied: huggingface-hub>=0.21.2 in /usr/local/lib/python3.10/dist-packages (from datasets) (0.24.6)\n",
            "Requirement already satisfied: packaging in /usr/local/lib/python3.10/dist-packages (from datasets) (24.1)\n",
            "Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.10/dist-packages (from datasets) (6.0.2)\n",
            "Requirement already satisfied: aiohappyeyeballs>=2.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (2.4.0)\n",
            "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (1.3.1)\n",
            "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (24.2.0)\n",
            "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (1.4.1)\n",
            "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (6.0.5)\n",
            "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (1.9.4)\n",
            "Requirement already satisfied: async-timeout<5.0,>=4.0 in /usr/local/lib/python3.10/dist-packages (from aiohttp->datasets) (4.0.3)\n",
            "Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests>=2.32.2->datasets) (3.3.2)\n",
            "Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests>=2.32.2->datasets) (3.8)\n",
            "Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->datasets) (2.8.2)\n",
            "Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->datasets) (2024.1)\n",
            "Requirement already satisfied: tzdata>=2022.1 in /usr/local/lib/python3.10/dist-packages (from pandas->datasets) (2024.1)\n",
            "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->datasets) (1.16.0)\n",
            "Downloading pinecone-5.1.0-py3-none-any.whl (245 kB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m245.5/245.5 kB\u001b[0m \u001b[31m4.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hDownloading datasets-2.21.0-py3-none-any.whl (527 kB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m527.3/527.3 kB\u001b[0m \u001b[31m19.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hDownloading dill-0.3.8-py3-none-any.whl (116 kB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m116.3/116.3 kB\u001b[0m \u001b[31m7.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hDownloading pinecone_plugin_inference-1.0.3-py3-none-any.whl (117 kB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m117.6/117.6 kB\u001b[0m \u001b[31m6.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hDownloading pinecone_plugin_interface-0.0.7-py3-none-any.whl (6.2 kB)\n",
            "Downloading pyarrow-17.0.0-cp310-cp310-manylinux_2_28_x86_64.whl (39.9 MB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m39.9/39.9 MB\u001b[0m \u001b[31m21.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hDownloading multiprocess-0.70.16-py310-none-any.whl (134 kB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m134.8/134.8 kB\u001b[0m \u001b[31m4.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hDownloading xxhash-3.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (194 kB)\n",
            "\u001b[2K   \u001b[90m\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u001b[0m \u001b[32m194.1/194.1 kB\u001b[0m \u001b[31m9.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hInstalling collected packages: xxhash, pyarrow, pinecone-plugin-interface, dill, pinecone-plugin-inference, multiprocess, pinecone, datasets\n",
            "  Attempting uninstall: pyarrow\n",
            "    Found existing installation: pyarrow 14.0.2\n",
            "    Uninstalling pyarrow-14.0.2:\n",
            "      Successfully uninstalled pyarrow-14.0.2\n",
            "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n",
            "cudf-cu12 24.4.1 requires pyarrow<15.0.0a0,>=14.0.1, but you have pyarrow 17.0.0 which is incompatible.\n",
            "ibis-framework 8.0.0 requires pyarrow<16,>=2, but you have pyarrow 17.0.0 which is incompatible.\u001b[0m\u001b[31m\n",
            "\u001b[0mSuccessfully installed datasets-2.21.0 dill-0.3.8 multiprocess-0.70.16 pinecone-5.1.0 pinecone-plugin-inference-1.0.3 pinecone-plugin-interface-0.0.7 pyarrow-17.0.0 xxhash-3.5.0\n"
          ]
        }
      ],
      "source": [
        "!pip install pinecone datasets\n",
        "\n",
        "# helper for authenticating with Pinecone from this notebook\n",
        "!pip install -qU pinecone-notebooks"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "AFMjb5BArVsB"
      },
      "source": [
        "## Setting the Scene: Learning a new language!\n",
        "\n",
        "Suppose you are learning Spanish (or English!) and want to describe your weekend trip to a local park in Spanish. You learn about Tatoeba, a website for language learners.\n",
        "\n",
        "Tatoeba aims to make it easy to see how a sentence is expressed in different languages, and it offers advanced keyword search to support this.\n",
        "\n",
        "You could search \"park\" on Tatoeba and get back sentences containing that word. But \"park\" can also be a verb in English, which may surface sentences that are irrelevant to your learning.\n",
        "\n",
        "We'll use semantic search to improve these results, surfacing sentences that are relevant and semantically similar to your queries.\n",
        "\n",
        "We'll approach the problem in two ways:\n",
        "- first, creating an index with only English text, to demonstrate the workflow\n",
        "- then, creating an index with the Spanish text and searching over it in English!\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "aRijj04OrqkR"
      },
      "source": [
        "![tatoeba_search.png]()"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "63wUihyErqkQ"
      },
      "source": [
        "### Downloading our dataset\n",
        "\n",
        "\n",
        "\n",
        "We'll use Helsinki-NLP's version of the Tatoeba dataset, hosted on Hugging Face.\n",
        "\n",
        "Using their English-Spanish translation pairs, we'll implement semantic search so we can surface not only translations but also sentences that are semantically similar and relevant to the user, allowing for easier contextual language learning.\n",
        "\n",
        "We'll stick to English and Spanish here, but the Inference API's multilingual-e5-large model supports nearly 100 languages, so the possibilities for what you can build are endless!\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 247
        },
        "id": "jbj8onX2wVR-",
        "outputId": "38f2668d-0b21-4b89-f743-6e585836c9b3"
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "<script type=\"text/javascript\" src=\"https://connect.pinecone.io/embed.js\"></script>"
            ],
            "text/plain": [
              "<IPython.core.display.HTML object>"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        }
      ],
      "source": [
        "from pinecone_notebooks.colab import Authenticate\n",
        "\n",
        "Authenticate()"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 252,
          "referenced_widgets": [
            "15c8997e32b646bb82462a452e03a83a",
            "d9bbb5950b59442d83bebd4b81b2fbf2",
            "ba6285c26a2f47c096f0fdcccae8afbf",
            "5823795b2b8744b8910d2041d4b3e5cb",
            "3161058b9846448fa9afce347f85dc5d",
            "76eb5b33fa7249a198bd8fec0afb685e",
            "d0152c4ce9f5482b8f3006f069be6e3b",
            "8c321796f4ef46168683e8de011d146c",
            "e273eccb98114bdd9c820ac40f8afcd6",
            "aaff5d54c75d4ec393d93231ff2b58e0",
            "b36b26e49db64e2d8db70d9225d71fb9",
            "f8f5116e61cb49a5ab9e0fc11fa81e5d",
            "5797ee6eaa6a441898bebac99223cfc7",
            "54d183f1ad4141b086ac5605a1007b36",
            "0cb5cb0109594126a0ca603745a52215",
            "d887a67220954822a3579ece47fe9681",
            "ac27fa33d4fc4c3b9b4469174fc9ce04",
            "ef65d05d1b704110b9440f6c55444803",
            "77a302b3b1f0431fbf1186617a389467",
            "bc82772ad57d4dd6971443d30a9f0d5f",
            "3a2cb194db984f2ba7ceeef021a8fe8f",
            "01f37d165d6c41a3bd10141b7021866a",
            "c78d7cf62f52433da3579a22f4a8072d",
            "f7de2e962cac4dde931cccbec97b0ed8",
            "a2b769d4828e411ba327ec16b1f8ca91",
            "6f1c271c04c04178aa67ac151e8cbaa3",
            "4a5f8e7a856b483992cdada1ca6ad474",
            "143f275e06bd4f8dab8042f5500bf778",
            "d829fb965d5e44428a226b917f3d2896",
            "7e7d0b85f92e487bb2789137229e9d50",
            "c42eccdf27064583b2c1c485d49643e6",
            "78e9740ebca643ef82a156a5999f023c",
            "a570110b66e443368667af6245292ea6",
            "7ff358e3f97546a18bb47a71a96b84ce",
            "e20f36a8234f475d89e72fba780cf693",
            "b20b1d5c472842abaa3454ef2b425921",
            "6401c663688840d4a1bac4131a20ca18",
            "daca9c36a451434ea6ca2e51d23962d5",
            "62380415ab3040a1ae930bc0948fa746",
            "b6b5bfc96de84aae9fa3be2d6d66e7e6",
            "1d0c52c5609a41b5b69a08186201b4f8",
            "12a49ba62a354652bf7c7faa26f46508",
            "0ad878b13d044b82b57d8907d8a93892",
            "58df243014084131bdd5ceb768d64964"
          ]
        },
        "id": "qGBOGMqHrqkQ",
        "outputId": "aa9c244e-67f9-4c03-830e-d3cc8774f428"
      },
      "outputs": [
        {
          "name": "stderr",
          "output_type": "stream",
          "text": [
            "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py:89: UserWarning: \n",
            "The secret `HF_TOKEN` does not exist in your Colab secrets.\n",
            "To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.\n",
            "You will be able to reuse this secret in all of your notebooks.\n",
            "Please note that authentication is recommended but still optional to access public models or datasets.\n",
            "  warnings.warn(\n"
          ]
        },
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "15c8997e32b646bb82462a452e03a83a",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Downloading builder script:   0%|          | 0.00/4.41k [00:00<?, ?B/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        },
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "f8f5116e61cb49a5ab9e0fc11fa81e5d",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Downloading readme:   0%|          | 0.00/8.93k [00:00<?, ?B/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        },
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "c78d7cf62f52433da3579a22f4a8072d",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Downloading data:   0%|          | 0.00/6.88M [00:00<?, ?B/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        },
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "7ff358e3f97546a18bb47a71a96b84ce",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Generating train split: 0 examples [00:00, ? examples/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        }
      ],
      "source": [
        "import os\n",
        "from datasets import load_dataset\n",
        "\n",
        "# specify that we want the english-spanish translation pairs\n",
        "tatoeba = load_dataset(\"Helsinki-NLP/tatoeba\", lang1=\"en\", lang2=\"es\", trust_remote_code=True)\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "olSFVVtprqkR"
      },
      "source": [
        "The Tatoeba dataset consists of translated sentence pairs. We'll take advantage of this property to demonstrate the monolingual and cross-lingual capabilities of our model."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "LwGmnpYCrqkR",
        "outputId": "6014459f-b785-4150-c674-b73388fec77a"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'id': '0',\n",
              " 'translation': {'en': \"Let's try something.\", 'es': '\u00a1Intentemos algo!'}}"
            ]
          },
          "execution_count": 4,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "tatoeba[\"train\"][0]"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "PSbKQfrorqkR"
      },
      "source": [
        "### Keywords as a foundation"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "FUWpr9f-rqkR"
      },
      "source": [
        "We'll use the keyword \"park\" to create an artificial subset of our data. \"Park\" has several senses in English: it can be a verb (to park a car) or a noun (to play at the park).\n",
        "\n",
        "We'll take advantage of this ambiguity to demonstrate how multilingual embeddings can differentiate between these meanings."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 49,
          "referenced_widgets": [
            "e43272b613b746e39374d50d92f3ec8e",
            "7dbc1c570e414f05bbca99b5913621c4",
            "a57e3fa6dc704d4bb2c0c852d01ed1db",
            "be2216ccadd54ec2933136d481d4fe51",
            "6ada861acad040248d3e894c748adb89",
            "106fed8661344fc2a0333e61c1ebd8cb",
            "e652be7069cc4facb7eb3d7e6bebf500",
            "d4f877898734464fb03aca5885b0eeef",
            "dadc183066db453fab7080947f6c1be5",
            "c0f92922aca2442b80461c21ced199a3",
            "0c4ced361de947e3a01c07703f2fe153"
          ]
        },
        "id": "WQh5AhWqrqkR",
        "outputId": "d25de929-c92f-40c1-eaa4-6af8f8ed861b"
      },
      "outputs": [
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "e43272b613b746e39374d50d92f3ec8e",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Filter:   0%|          | 0/214127 [00:00<?, ? examples/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        }
      ],
      "source": [
        "# grab sentences based on keywords\n",
        "\n",
        "keywords = [\"park\"]\n",
        "\n",
        "def simple_keyword_search(sentence, keywords):\n",
        "    # return True if the sentence contains any of the keywords\n",
        "    for keyword in keywords:\n",
        "        if keyword in sentence:\n",
        "            return True\n",
        "    return False\n",
        "\n",
        "# filter on English sentences\n",
        "translation_pairs = tatoeba[\"train\"].filter(lambda x: simple_keyword_search(\n",
        "    sentence=x[\"translation\"][\"en\"], keywords=keywords))\n",
        "\n",
        "\n",
        "# flatten and shuffle for ease of use\n",
        "translation_pairs = translation_pairs.flatten()\n",
        "translation_pairs = translation_pairs.shuffle(seed=1)"
      ]
    },
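    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Note that this substring filter is deliberately crude: the keyword 'park' also matches words like 'sparkling' or 'parking'. A minimal sketch of the same logic (`keyword_match` is just an illustrative name) shows the kind of false positive that semantic search will later help us rank down:\n",
        "\n",
        "```python\n",
        "def keyword_match(sentence, keywords):\n",
        "    # substring check: 'park' matches inside 'sparkling' too\n",
        "    return any(keyword in sentence for keyword in keywords)\n",
        "\n",
        "print(keyword_match('A glass of sparkling water, please.', ['park']))  # True\n",
        "```"
      ]
    },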
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "5LXvRKderqkS",
        "outputId": "eeb4f5b7-069f-4169-c4b1-7f0ae813970d"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "Dataset({\n",
              "    features: ['id', 'translation.en', 'translation.es'],\n",
              "    num_rows: 416\n",
              "})"
            ]
          },
          "execution_count": 6,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "translation_pairs"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 35
        },
        "id": "SMLn1M9xrqkS",
        "outputId": "3d8efd24-f033-4085-ff5e-18726e6ba2cf"
      },
      "outputs": [
        {
          "data": {
            "application/vnd.google.colaboratory.intrinsic+json": {
              "type": "string"
            },
            "text/plain": [
              "'When my brother was young, I often used to take him to the park.'"
            ]
          },
          "execution_count": 7,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "translation_pairs[0][\"translation.en\"]"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "KC08-QyErqkS",
        "outputId": "407721b1-15c4-4f6a-9fa7-41495a0fc19f"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "English sentence: When my brother was young, I often used to take him to the park. Spanish Translation: Cuando mi hermano era peque\u00f1o, sol\u00eda llevarle al parque a menudo.\n",
            "English sentence: Yumi went to the park to play tennis. Spanish Translation: Yumi se fue al parque a jugar al tenis.\n",
            "English sentence: Sir, you are not allowed to park your car here. Spanish Translation: No puede estacionarse aqu\u00ed.\n",
            "English sentence: There were a crowd of people in the park. Spanish Translation: Hab\u00eda una multitud en el parque.\n",
            "English sentence: There were almost no cars in the parking lot. Spanish Translation: Casi no hab\u00eda autos en el estacionamiento.\n",
            "English sentence: Tom says we can park on either side of the street. Spanish Translation: Tom dice que podemos aparcar en los dos lados de la calle.\n",
            "English sentence: I have lost my umbrella somewhere in the park. I have to buy one. Spanish Translation: Se me perdi\u00f3 mi paraguas en alguna parte en el parque. Tengo que comprar uno nuevo.\n",
            "English sentence: I like coming to this park at night. Spanish Translation: En las noches me gusta venir a este parque.\n",
            "English sentence: You may park here. Spanish Translation: Te puedes aparcar aqu\u00ed.\n",
            "English sentence: A glass of sparkling water, please. Spanish Translation: Un vaso de agua con gas, por favor.\n"
          ]
        }
      ],
      "source": [
        "for x in range(10):\n",
        "    eng_sentence = translation_pairs[x][\"translation.en\"]\n",
        "    es_sentence = translation_pairs[x][\"translation.es\"]\n",
        "    print(f\"English sentence: {eng_sentence} Spanish Translation: {es_sentence}\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "eROF7ILwrqkS"
      },
      "source": [
        "## Setting up our Indexes\n",
        "\n",
        "Next, we'll set up our Pinecone index and configure it with a dimensionality of 1024 to match our embedding model's output vector size."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "ot-aefr8rqkS"
      },
      "outputs": [],
      "source": [
        "from pinecone import Pinecone, ServerlessSpec\n",
        "\n",
        "DIMENSION = 1024\n",
        "INDEX_NAME = \"tatoeba-semantic-search\"\n",
        "\n",
        "pc = Pinecone(source_tag=\"tatoeba_semantic_search_notebook\")\n",
        "\n",
        "\n",
        "def create_index(index_name):\n",
        "\n",
        "    # checks if index already exists. If it does, skip!\n",
        "\n",
        "    if not pc.has_index(index_name):\n",
        "        index = pc.create_index(\n",
        "            name=index_name,\n",
        "            dimension=DIMENSION,\n",
        "            # this is the distance metric that E5 was trained with!\n",
        "            metric=\"cosine\",\n",
        "            spec=ServerlessSpec(\n",
        "                cloud='aws',\n",
        "                region='us-east-1'\n",
        "            )\n",
        "        )\n",
        "        return index\n",
        "\n",
        "\n",
        "create_index(INDEX_NAME)\n"
      ]
    },
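    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As an aside, the cosine metric scores a pair of vectors by their normalized dot product. A plain-Python sketch, purely for intuition (Pinecone computes this for us at query time):\n",
        "\n",
        "```python\n",
        "import math\n",
        "\n",
        "def cosine_similarity(a, b):\n",
        "    # cosine similarity = dot(a, b) / (|a| * |b|)\n",
        "    dot = sum(x * y for x, y in zip(a, b))\n",
        "    norm_a = math.sqrt(sum(x * x for x in a))\n",
        "    norm_b = math.sqrt(sum(y * y for y in b))\n",
        "    return dot / (norm_a * norm_b)\n",
        "\n",
        "print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0\n",
        "```"
      ]
    },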
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EXfEHCIRrqkS"
      },
      "source": [
        "## Embedding with the Pinecone Inference API and E5: Queries and Passages\n",
        "\n",
        "Fortunately for us, the multilingual-e5-large model in the Pinecone Inference API is easy to use with multilingual data. We don't need to concern ourselves with switching languages, as the endpoint will tokenize and embed for us. However, we do need to distinguish between queries and passages.\n",
        "\n",
        "**Queries** are the sentences or phrases we'll search with, and **passages** are the sentences we'll store in Pinecone. The distinction exists because multilingual-e5-large was trained on datasets that aligned queries with longer-form passages such as articles, chapters, etc. We specify which type of string we are embedding via the `input_type` parameter of the `inference.embed` function.\n",
        "\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "JRK9HebQrqkT"
      },
      "outputs": [],
      "source": [
        "def embed_data(sentences):\n",
        "    # given a list of sentences, return their embeddings from the Inference API\n",
        "    embeddings = pc.inference.embed(\n",
        "        model=\"multilingual-e5-large\",\n",
        "        # inputs must be passed as a list\n",
        "        inputs=sentences,\n",
        "        # \"passage\" input type, since these are the documents we'll search over\n",
        "        # (an optional \"truncate\" parameter is available for longer passages)\n",
        "        parameters={\"input_type\": \"passage\"}\n",
        "    )\n",
        "    return embeddings\n"
      ]
    },
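    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Later, when we search, the query side should use `input_type` set to `query` rather than `passage`. A sketch of a hypothetical helper (`embed_query` is our own name, not part of the SDK), reusing the `pc` client from above:\n",
        "\n",
        "```python\n",
        "def embed_query(query):\n",
        "    # queries get input_type 'query' so E5 applies its query prefix\n",
        "    return pc.inference.embed(\n",
        "        model='multilingual-e5-large',\n",
        "        inputs=[query],\n",
        "        parameters={'input_type': 'query'},\n",
        "    )\n",
        "```"
      ]
    },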
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "Tu2z3JObrqkT",
        "outputId": "08c9cf6d-206a-4c13-8e27-4b24111f14dd"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "Dataset({\n",
              "    features: ['id', 'translation.en', 'translation.es'],\n",
              "    num_rows: 416\n",
              "})"
            ]
          },
          "execution_count": 11,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "translation_pairs"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "TTnGf5sYsx11"
      },
      "source": [
        "Let's take a quick look at our data to understand what we're working with!"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "93LtIA4urqkT",
        "outputId": "84113fe3-e4a9-433e-aea9-ac909ff24bf1"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "When my brother was young, I often used to take him to the park. Cuando mi hermano era peque\u00f1o, sol\u00eda llevarle al parque a menudo.\n"
          ]
        }
      ],
      "source": [
        "english_sentence = translation_pairs[0][\"translation.en\"]\n",
        "spanish_translation = translation_pairs[0][\"translation.es\"]\n",
        "\n",
        "print(english_sentence, spanish_translation)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "NS1A90mj_b0A"
      },
      "outputs": [],
      "source": [
        "en_embedding = embed_data([english_sentence])\n",
        "es_embedding = embed_data([spanish_translation])"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "GJlw0u1XrqkT",
        "outputId": "ba1fcdfd-e253-45db-b89b-f0854fc64d79"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "EmbeddingsList(\n",
              "  model='multilingual-e5-large',\n",
              "  data=[\n",
              "    {'values': [0.017303466796875, -0.0082855224609375, ..., -0.0204315185546875, -0.0007982254028320312]}\n",
              "  ],\n",
              "  usage={'total_tokens': 20}\n",
              ")"
            ]
          },
          "execution_count": 14,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "en_embedding"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "hqgS9BS4rqkT",
        "outputId": "2007ab01-550c-4ab4-ba83-710fd7f6966c"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "EmbeddingsList(\n",
              "  model='multilingual-e5-large',\n",
              "  data=[\n",
              "    {'values': [0.0149993896484375, -0.0218353271484375, ..., -0.0209197998046875, 0.005039215087890625]}\n",
              "  ],\n",
              "  usage={'total_tokens': 20}\n",
              ")"
            ]
          },
          "execution_count": 15,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "es_embedding"
      ]
    },
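    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a quick sanity check, we can compare the two embeddings directly. Since the English sentence and its Spanish translation share the same meaning, their vectors should be highly similar. Here's a small sketch using numpy (assuming the `en_embedding` and `es_embedding` objects from the cells above):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "import numpy as np\n",
        "\n",
        "en_vec = np.array(en_embedding.data[0][\"values\"])\n",
        "es_vec = np.array(es_embedding.data[0][\"values\"])\n",
        "\n",
        "# cosine similarity: a value close to 1 means the sentences are semantically aligned\n",
        "cosine_similarity = np.dot(en_vec, es_vec) / (np.linalg.norm(en_vec) * np.linalg.norm(es_vec))\n",
        "cosine_similarity"
      ]
    },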
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "LGs1_QJJrqkT"
      },
      "source": [
        "## Batch Embedding and Upsertion\n",
        "\n",
        "We can combine the embedding and upserting operations! We'll need to keep in mind that the **maximum batch size for our current model is 96.**\n",
        "\n",
        "For simplicity, we'll batch at this size and upsert at the same time.\n",
        "\n",
        "First, we'll do some data cleaning to make our task a bit clearer and merge our datasets.\n",
        "\n",
        "\n"
      ]
    },
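    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The batching logic itself is simple. As an illustration only, splitting a list into batches of at most 96 items can be sketched like this (the `map` helper we use shortly handles this for us):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# a sketch of manual batching, for illustration only\n",
        "def batches(items, size=96):\n",
        "    # yield successive slices of at most `size` items\n",
        "    for i in range(0, len(items), size):\n",
        "        yield items[i:i + size]\n",
        "\n",
        "# e.g. 832 sentences -> 8 batches of 96 and one batch of 64\n",
        "[len(b) for b in batches(list(range(832)))]"
      ]
    },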
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 81,
          "referenced_widgets": [
            "e0ebb49d572c4116a3aa68ea62a6f56d",
            "4f1332040de9475ea1885f132a01a526",
            "faf22507f1df442bbc4698be129acb56",
            "a1b94cabdf1b4705aac79c72357dd41b",
            "ebb5b3d7a3824bcba8b148896559755f",
            "d6fee0a2d190468489b020a2c6c51d70",
            "c0f940f32122472a88c735552c2734d8",
            "073ff4c4d38d4a5890d72cc0b9717dce",
            "b05e982ec7dc4db8a7da24d99448eab8",
            "69ab20d139714352b3810b1f10897f5d",
            "aef05f0369734016b6137e3e8c93242e",
            "45339cbf0c7644e1b448b80e7e489040",
            "8335923822a341cfb753c0a78f72c79b",
            "3d0b743e3c3d4eae9db6ecc88d566f19",
            "40e9368ab4e54ae192fff3be1d2f7dae",
            "1f915a11b05a403492e392d9bcef2468",
            "27ee43edb57d4b41bd1ce9ca198e9d2f",
            "30144d461cd245bd96921a678dc1d92a",
            "26cca199197e463dbffc58dbc9e5fb4b",
            "e93c6b8e261e4a94b3cb8ad712a0e6b0",
            "226ebff3a4174d0c8383eeedc626c1ae",
            "dde681d1310c4cb5abca200e65103d76"
          ]
        },
        "id": "ZA3rG-LQrqkT",
        "outputId": "85b738e5-6f6d-4996-9321-4f891794d3ba"
      },
      "outputs": [
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "e0ebb49d572c4116a3aa68ea62a6f56d",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Flattening the indices:   0%|          | 0/416 [00:00<?, ? examples/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        },
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "45339cbf0c7644e1b448b80e7e489040",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Flattening the indices:   0%|          | 0/416 [00:00<?, ? examples/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        }
      ],
      "source": [
        "# cleaning columns for ease of use\n",
        "\n",
        "from datasets import concatenate_datasets\n",
        "\n",
        "english_sentences = translation_pairs.rename_column(\"translation.en\", \"text\").remove_columns(\"translation.es\")\n",
        "spanish_sentences = translation_pairs.rename_column(\"translation.es\", \"text\").remove_columns(\"translation.en\")\n",
        "\n",
        "# add lang column to indicate embedding origin\n",
        "english_sentences = english_sentences.add_column(\"lang\", [\"en\"] * len(english_sentences))\n",
        "spanish_sentences = spanish_sentences.add_column(\"lang\", [\"es\"] * len(spanish_sentences))\n",
        "\n",
        "# merge datasets\n",
        "all_sentences = concatenate_datasets([english_sentences, spanish_sentences])\n",
        "\n",
        "# overwrite the id column with unique ids for the merged dataset\n",
        "all_sentences = all_sentences.remove_columns(\"id\")\n",
        "all_sentences = all_sentences.add_column(\"id\", list(range(len(all_sentences))))\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "6XFWoHdPt6dL",
        "outputId": "6eaff8ab-547e-4991-f714-316de43e1d2a"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'text': ['When my brother was young, I often used to take him to the park.',\n",
              "  'Yumi went to the park to play tennis.',\n",
              "  'Sir, you are not allowed to park your car here.',\n",
              "  'There were a crowd of people in the park.',\n",
              "  'There were almost no cars in the parking lot.'],\n",
              " 'lang': ['en', 'en', 'en', 'en', 'en'],\n",
              " 'id': [0, 1, 2, 3, 4]}"
            ]
          },
          "execution_count": 17,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "all_sentences[0:5]\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "sikvL7Z8_3-B"
      },
      "source": [
        "When turning the embeddings into vectors for upsertion, we'll store the text and the language the embedding corresponds to as metadata. This will make searching by language over the dataset easier.\n",
        "\n",
        "Lucky for us, we can use [the map function](https://huggingface.co/docs/datasets/en/process#map) from the datasets library to\n",
        "make batch processing painless!\n",
        "\n",
        "The map function allows us to iterate over our dataset in batches, on which\n",
        "we can embed and upsert our data. This also means we'll only need one pass\n",
        "over our data to finish processing it for our index.\n",
        "\n",
        "Finally, we'll return the embedding as a new column in the returned dataset.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "Ix4QmShryiHU",
        "outputId": "362b0981-4cce-409f-e17d-1bcf389c9e49"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "[{'values': [0.017303466796875, -0.0082855224609375, ..., -0.0204315185546875, -0.0007982254028320312]},\n",
              " {'values': [0.01403045654296875, -0.00792694091796875, ..., -0.01126861572265625, -0.007434844970703125]},\n",
              " {'values': [0.033782958984375, -0.0380859375, ..., -0.04339599609375, 0.002239227294921875]},\n",
              " {'values': [0.04522705078125, -0.0013141632080078125, ..., -0.01456451416015625, 0.01454925537109375]},\n",
              " {'values': [0.033447265625, -0.020233154296875, ..., -0.0372314453125, -0.004901885986328125]}]"
            ]
          },
          "execution_count": 18,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "embed_data(all_sentences[0:5][\"text\"]).data"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "LAEr56YorqkT"
      },
      "outputs": [],
      "source": [
        "BATCH_SIZE = 96\n",
        "\n",
        "\n",
        "def embed_and_upsert(batch, index):\n",
        "\n",
        "  # Given batch of data as dict from dataset.map\n",
        "  # and Index to upsert with Pinecone\n",
        "  # embed and upsert the batch\n",
        "  # and, return the embeddings as a new column\n",
        "\n",
        "\n",
        "  # embed the batch and pull the embeddings\n",
        "  embeddings = embed_data(batch[\"text\"]).data\n",
        "  # reshape for easier indexing\n",
        "  embeddings = [d[\"values\"] for d in embeddings]\n",
        "\n",
        "  # reformat into vectors, for upsertion\n",
        "  vectors = []\n",
        "  for i in range(len(embeddings)):\n",
        "\n",
        "    id = str(batch[\"id\"][i])\n",
        "    embedding = embeddings[i]\n",
        "    text = batch[\"text\"][i]\n",
        "    lang = batch[\"lang\"][i]\n",
        "\n",
        "    vector = {\n",
        "        \"id\": id,\n",
        "        \"values\": embedding,\n",
        "        \"metadata\":{\n",
        "            \"text\": text,\n",
        "            \"lang\": lang,\n",
        "        },\n",
        "    }\n",
        "    vectors.append(vector)\n",
        "\n",
        "  # upsert the batch of vectors into the index\n",
        "  index.upsert(\n",
        "      vectors = vectors\n",
        "  )\n",
        "\n",
        "  # return as a new column\n",
        "  return {\"embeddings\": embeddings}\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 105,
          "referenced_widgets": [
            "861e03344df7459ab3a5fb7cd5de0ecd",
            "69ed82399a04422b8884d1bb11d791e9",
            "5899b52e36674800b5ec6e1170b05a1a",
            "cd372f86ea494187844974201de29e81",
            "9c4653a532564a5dbc9a281c3fd82825",
            "c2bf63621dd34f4c8c751aa9049745e1",
            "fe6b4f7ebe064215b6daf207acac4ba6",
            "05655faa18c94c0f8cb0e1d8bd94207d",
            "1df44960fae7451db9773a75fe6e06d7",
            "4d5436bff2294062a4e69d80f5b59cab",
            "65cb940f24094a75a265a3522f96b614"
          ]
        },
        "id": "obtBoGHu1B3F",
        "outputId": "bb11ee74-0fc0-47ae-946c-4da09aebb3ad"
      },
      "outputs": [
        {
          "name": "stderr",
          "output_type": "stream",
          "text": [
            "Parameter 'function'=<function <lambda> at 0x7cabb4eca0e0> of the transform datasets.arrow_dataset.Dataset._map_single couldn't be hashed properly, a random hash was used instead. Make sure your transforms and parameters are serializable with pickle or dill for the dataset fingerprinting and caching to work. If you reuse this transform, the caching mechanism will consider it to be different from the previous calls and recompute everything. This warning is only showed once. Subsequent hashing failures won't be showed.\n",
            "WARNING:datasets.fingerprint:Parameter 'function'=<function <lambda> at 0x7cabb4eca0e0> of the transform datasets.arrow_dataset.Dataset._map_single couldn't be hashed properly, a random hash was used instead. Make sure your transforms and parameters are serializable with pickle or dill for the dataset fingerprinting and caching to work. If you reuse this transform, the caching mechanism will consider it to be different from the previous calls and recompute everything. This warning is only showed once. Subsequent hashing failures won't be showed.\n"
          ]
        },
        {
          "data": {
            "application/vnd.jupyter.widget-view+json": {
              "model_id": "861e03344df7459ab3a5fb7cd5de0ecd",
              "version_major": 2,
              "version_minor": 0
            },
            "text/plain": [
              "Map:   0%|          | 0/832 [00:00<?, ? examples/s]"
            ]
          },
          "metadata": {},
          "output_type": "display_data"
        }
      ],
      "source": [
        "index = pc.Index(INDEX_NAME)\n",
        "final_ds = all_sentences.map(lambda batch: embed_and_upsert(batch=batch, index=index),\n",
        "                                           batched=True,\n",
        "                                           batch_size=BATCH_SIZE)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "wjrhgi7VNqqL",
        "outputId": "dd7d400b-d1de-4948-ff37-6d3809dc42e9"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'text': 'When my brother was young, I often used to take him to the park.',\n",
              " 'lang': 'en',\n",
              " 'id': 0,\n",
              " 'embeddings': [0.017303466796875,\n",
              "  -0.0082855224609375,\n",
              "  -0.027801513671875,\n",
              "  -0.021484375,\n",
              "  0.00833892822265625,\n",
              "  -0.05487060546875,\n",
              "  -0.0041046142578125,\n",
              "  0.0406494140625,\n",
              "  0.07781982421875,\n",
              "  -0.0136260986328125,\n",
              "  0.02691650390625,\n",
              "  0.00975799560546875,\n",
              "  -0.059173583984375,\n",
              "  0.00798797607421875,\n",
              "  0.004856109619140625,\n",
              "  0.0190277099609375,\n",
              "  -0.00975799560546875,\n",
              "  0.04913330078125,\n",
              "  0.006168365478515625,\n",
              "  -0.019989013671875,\n",
              "  0.0594482421875,\n",
              "  -0.0294342041015625,\n",
              "  -0.0450439453125,\n",
              "  -0.038421630859375,\n",
              "  -0.0374755859375,\n",
              "  -0.04534912109375,\n",
              "  -0.035858154296875,\n",
              "  -0.0267181396484375,\n",
              "  -0.018890380859375,\n",
              "  -0.054046630859375,\n",
              "  -0.00766754150390625,\n",
              "  0.007450103759765625,\n",
              "  -0.0384521484375,\n",
              "  -0.04071044921875,\n",
              "  -0.0267486572265625,\n",
              "  0.042510986328125,\n",
              "  0.0237274169921875,\n",
              "  0.00766754150390625,\n",
              "  -0.0146331787109375,\n",
              "  0.0167999267578125,\n",
              "  -0.0047760009765625,\n",
              "  -0.0014858245849609375,\n",
              "  0.0031871795654296875,\n",
              "  -0.00762176513671875,\n",
              "  -0.0252838134765625,\n",
              "  -0.005054473876953125,\n",
              "  0.045501708984375,\n",
              "  0.020111083984375,\n",
              "  -0.026092529296875,\n",
              "  0.03076171875,\n",
              "  0.0225982666015625,\n",
              "  -0.007167816162109375,\n",
              "  -0.0472412109375,\n",
              "  -0.037628173828125,\n",
              "  -0.0562744140625,\n",
              "  0.028076171875,\n",
              "  -0.021270751953125,\n",
              "  0.0177459716796875,\n",
              "  -0.034393310546875,\n",
              "  -0.0011196136474609375,\n",
              "  -0.004863739013671875,\n",
              "  -0.0650634765625,\n",
              "  0.01555633544921875,\n",
              "  -0.0361328125,\n",
              "  -0.0438232421875,\n",
              "  0.07147216796875,\n",
              "  0.033843994140625,\n",
              "  0.009033203125,\n",
              "  -0.058013916015625,\n",
              "  0.038604736328125,\n",
              "  -0.0435791015625,\n",
              "  0.022125244140625,\n",
              "  -0.01242828369140625,\n",
              "  -0.0176849365234375,\n",
              "  0.003383636474609375,\n",
              "  -0.035736083984375,\n",
              "  0.0230560302734375,\n",
              "  0.01018524169921875,\n",
              "  0.0201416015625,\n",
              "  -0.0156402587890625,\n",
              "  0.0772705078125,\n",
              "  0.01021575927734375,\n",
              "  0.02642822265625,\n",
              "  -0.0269317626953125,\n",
              "  0.053924560546875,\n",
              "  0.057647705078125,\n",
              "  0.063720703125,\n",
              "  0.01751708984375,\n",
              "  0.033782958984375,\n",
              "  0.033477783203125,\n",
              "  -0.0197906494140625,\n",
              "  0.033905029296875,\n",
              "  0.0233306884765625,\n",
              "  -0.045928955078125,\n",
              "  -0.04388427734375,\n",
              "  0.01163482666015625,\n",
              "  0.05224609375,\n",
              "  0.0189208984375,\n",
              "  0.0110931396484375,\n",
              "  0.01287841796875,\n",
              "  0.00733184814453125,\n",
              "  -0.0208282470703125,\n",
              "  0.00839996337890625,\n",
              "  -0.0113372802734375,\n",
              "  0.006305694580078125,\n",
              "  0.06549072265625,\n",
              "  0.01424407958984375,\n",
              "  0.00278472900390625,\n",
              "  -0.0173797607421875,\n",
              "  0.0253753662109375,\n",
              "  0.06231689453125,\n",
              "  0.030364990234375,\n",
              "  0.057708740234375,\n",
              "  -0.012115478515625,\n",
              "  -0.0107421875,\n",
              "  0.01971435546875,\n",
              "  0.034820556640625,\n",
              "  -0.026611328125,\n",
              "  -0.041259765625,\n",
              "  0.051055908203125,\n",
              "  0.046905517578125,\n",
              "  0.06329345703125,\n",
              "  0.004669189453125,\n",
              "  -0.0237274169921875,\n",
              "  -0.003692626953125,\n",
              "  0.01145172119140625,\n",
              "  0.00841522216796875,\n",
              "  0.039276123046875,\n",
              "  -0.0150604248046875,\n",
              "  -0.01149749755859375,\n",
              "  0.033050537109375,\n",
              "  0.034759521484375,\n",
              "  -0.03997802734375,\n",
              "  -0.060882568359375,\n",
              "  -0.007602691650390625,\n",
              "  -0.026702880859375,\n",
              "  -0.035430908203125,\n",
              "  -0.0033092498779296875,\n",
              "  -0.0258331298828125,\n",
              "  0.0032596588134765625,\n",
              "  -0.0013360977172851562,\n",
              "  0.029388427734375,\n",
              "  0.0704345703125,\n",
              "  -0.05035400390625,\n",
              "  -0.044342041015625,\n",
              "  -0.0029659271240234375,\n",
              "  -0.0406494140625,\n",
              "  0.0156707763671875,\n",
              "  0.0037250518798828125,\n",
              "  -0.002925872802734375,\n",
              "  0.00798797607421875,\n",
              "  -0.016632080078125,\n",
              "  -0.0036563873291015625,\n",
              "  -0.00867462158203125,\n",
              "  0.0193634033203125,\n",
              "  -0.03839111328125,\n",
              "  -0.0037975311279296875,\n",
              "  0.035064697265625,\n",
              "  0.057281494140625,\n",
              "  -0.00907135009765625,\n",
              "  -0.028167724609375,\n",
              "  -0.0748291015625,\n",
              "  -0.0249786376953125,\n",
              "  -0.0228118896484375,\n",
              "  0.004558563232421875,\n",
              "  -0.026885986328125,\n",
              "  0.03680419921875,\n",
              "  0.0200042724609375,\n",
              "  -0.005214691162109375,\n",
              "  0.017059326171875,\n",
              "  -0.01364898681640625,\n",
              "  -0.01409912109375,\n",
              "  0.0310211181640625,\n",
              "  -0.040924072265625,\n",
              "  -0.021026611328125,\n",
              "  0.032958984375,\n",
              "  0.0007581710815429688,\n",
              "  0.01360321044921875,\n",
              "  0.06292724609375,\n",
              "  -0.02978515625,\n",
              "  0.034698486328125,\n",
              "  0.0262908935546875,\n",
              "  0.037200927734375,\n",
              "  -0.01201629638671875,\n",
              "  -0.0206756591796875,\n",
              "  0.0304107666015625,\n",
              "  0.0222930908203125,\n",
              "  0.0117950439453125,\n",
              "  0.0052947998046875,\n",
              "  0.023101806640625,\n",
              "  0.017822265625,\n",
              "  -0.0338134765625,\n",
              "  -0.0190582275390625,\n",
              "  -0.009735107421875,\n",
              "  -0.026092529296875,\n",
              "  0.043914794921875,\n",
              "  -0.01535797119140625,\n",
              "  0.028167724609375,\n",
              "  -0.034423828125,\n",
              "  -0.04681396484375,\n",
              "  0.015777587890625,\n",
              "  0.0081634521484375,\n",
              "  -0.05377197265625,\n",
              "  0.0162200927734375,\n",
              "  0.027984619140625,\n",
              "  -0.005340576171875,\n",
              "  -0.0394287109375,\n",
              "  -0.0435791015625,\n",
              "  0.052459716796875,\n",
              "  -0.0791015625,\n",
              "  -0.0478515625,\n",
              "  -0.005802154541015625,\n",
              "  0.03155517578125,\n",
              "  0.033416748046875,\n",
              "  -0.01446533203125,\n",
              "  -0.06292724609375,\n",
              "  -0.031097412109375,\n",
              "  -0.002590179443359375,\n",
              "  0.0230560302734375,\n",
              "  0.040802001953125,\n",
              "  -0.01004791259765625,\n",
              "  0.051422119140625,\n",
              "  0.00380706787109375,\n",
              "  0.02069091796875,\n",
              "  -0.009063720703125,\n",
              "  0.02484130859375,\n",
              "  0.0391845703125,\n",
              "  0.031707763671875,\n",
              "  0.0038776397705078125,\n",
              "  -0.006305694580078125,\n",
              "  0.0184326171875,\n",
              "  0.0188140869140625,\n",
              "  -0.04852294921875,\n",
              "  -0.01097869873046875,\n",
              "  -0.0074615478515625,\n",
              "  -0.0142822265625,\n",
              "  -0.01263427734375,\n",
              "  -0.0097198486328125,\n",
              "  0.037078857421875,\n",
              "  0.0283050537109375,\n",
              "  -0.04058837890625,\n",
              "  0.024658203125,\n",
              "  0.0209197998046875,\n",
              "  0.007175445556640625,\n",
              "  0.0244293212890625,\n",
              "  -0.010589599609375,\n",
              "  0.00897216796875,\n",
              "  -0.028594970703125,\n",
              "  0.01393890380859375,\n",
              "  -0.016265869140625,\n",
              "  0.01459503173828125,\n",
              "  0.032318115234375,\n",
              "  -0.037109375,\n",
              "  -0.03155517578125,\n",
              "  -0.005107879638671875,\n",
              "  0.01413726806640625,\n",
              "  0.044036865234375,\n",
              "  0.0029582977294921875,\n",
              "  0.0168914794921875,\n",
              "  -0.0391845703125,\n",
              "  0.005298614501953125,\n",
              "  -0.00372314453125,\n",
              "  -0.00148773193359375,\n",
              "  0.041717529296875,\n",
              "  0.01354217529296875,\n",
              "  0.0169830322265625,\n",
              "  -0.0015096664428710938,\n",
              "  -0.025726318359375,\n",
              "  -0.0287933349609375,\n",
              "  -0.057586669921875,\n",
              "  -0.03900146484375,\n",
              "  -0.034027099609375,\n",
              "  -0.05291748046875,\n",
              "  -0.041534423828125,\n",
              "  -0.0102081298828125,\n",
              "  -0.0085296630859375,\n",
              "  -0.070068359375,\n",
              "  -0.002956390380859375,\n",
              "  -0.02496337890625,\n",
              "  0.012603759765625,\n",
              "  0.0160064697265625,\n",
              "  -0.0130767822265625,\n",
              "  0.0240478515625,\n",
              "  -0.058349609375,\n",
              "  0.004283905029296875,\n",
              "  -0.021148681640625,\n",
              "  -0.022125244140625,\n",
              "  0.0107574462890625,\n",
              "  -0.00986480712890625,\n",
              "  -0.01959228515625,\n",
              "  0.0138397216796875,\n",
              "  -0.05364990234375,\n",
              "  0.058349609375,\n",
              "  0.045379638671875,\n",
              "  0.020782470703125,\n",
              "  0.0039520263671875,\n",
              "  -0.0292205810546875,\n",
              "  -0.0716552734375,\n",
              "  0.00505828857421875,\n",
              "  0.005298614501953125,\n",
              "  -0.03070068359375,\n",
              "  -0.058807373046875,\n",
              "  0.032257080078125,\n",
              "  0.0150604248046875,\n",
              "  -0.034912109375,\n",
              "  -0.03753662109375,\n",
              "  0.01788330078125,\n",
              "  0.006381988525390625,\n",
              "  -0.04400634765625,\n",
              "  0.03057861328125,\n",
              "  0.08856201171875,\n",
              "  0.046661376953125,\n",
              "  0.030853271484375,\n",
              "  -0.0310211181640625,\n",
              "  -0.0322265625,\n",
              "  -0.026092529296875,\n",
              "  -0.027252197265625,\n",
              "  0.0726318359375,\n",
              "  -0.045196533203125,\n",
              "  0.038726806640625,\n",
              "  -0.0006809234619140625,\n",
              "  -0.0046234130859375,\n",
              "  0.014190673828125,\n",
              "  0.00909423828125,\n",
              "  0.0202789306640625,\n",
              "  0.0007386207580566406,\n",
              "  0.012786865234375,\n",
              "  -0.0248870849609375,\n",
              "  0.0004792213439941406,\n",
              "  -0.0297088623046875,\n",
              "  -0.00299072265625,\n",
              "  -0.0269622802734375,\n",
              "  -0.00119781494140625,\n",
              "  0.055877685546875,\n",
              "  -0.05450439453125,\n",
              "  0.02899169921875,\n",
              "  -0.0286407470703125,\n",
              "  0.0215606689453125,\n",
              "  0.039886474609375,\n",
              "  0.00409698486328125,\n",
              "  0.045257568359375,\n",
              "  -0.00571441650390625,\n",
              "  -0.10052490234375,\n",
              "  -0.042449951171875,\n",
              "  0.019805908203125,\n",
              "  -0.01079559326171875,\n",
              "  0.026580810546875,\n",
              "  -0.03155517578125,\n",
              "  0.036285400390625,\n",
              "  0.022430419921875,\n",
              "  -0.0008540153503417969,\n",
              "  -0.0260467529296875,\n",
              "  -0.005519866943359375,\n",
              "  -0.0016698837280273438,\n",
              "  -0.0039043426513671875,\n",
              "  0.02935791015625,\n",
              "  0.0361328125,\n",
              "  -0.03466796875,\n",
              "  -0.004512786865234375,\n",
              "  -0.026153564453125,\n",
              "  0.0238800048828125,\n",
              "  0.00891876220703125,\n",
              "  0.029449462890625,\n",
              "  -0.00943756103515625,\n",
              "  -0.0015811920166015625,\n",
              "  -0.011505126953125,\n",
              "  0.01233673095703125,\n",
              "  0.00914764404296875,\n",
              "  -0.04168701171875,\n",
              "  -0.0408935546875,\n",
              "  0.01291656494140625,\n",
              "  0.0230865478515625,\n",
              "  -0.01396942138671875,\n",
              "  0.00728607177734375,\n",
              "  0.0252838134765625,\n",
              "  0.0022258758544921875,\n",
              "  -0.062744140625,\n",
              "  -0.0175933837890625,\n",
              "  -0.0008106231689453125,\n",
              "  -0.0189971923828125,\n",
              "  -0.0611572265625,\n",
              "  -0.0312347412109375,\n",
              "  0.01232147216796875,\n",
              "  -0.038848876953125,\n",
              "  -0.0194549560546875,\n",
              "  -0.0274810791015625,\n",
              "  0.152099609375,\n",
              "  -0.001941680908203125,\n",
              "  0.06634521484375,\n",
              "  -0.035186767578125,\n",
              "  -0.01210784912109375,\n",
              "  0.010009765625,\n",
              "  -0.0304107666015625,\n",
              "  0.0161895751953125,\n",
              "  0.0258331298828125,\n",
              "  0.0283966064453125,\n",
              "  -0.0306243896484375,\n",
              "  0.048248291015625,\n",
              "  0.017364501953125,\n",
              "  -0.033905029296875,\n",
              "  0.00814056396484375,\n",
              "  0.044830322265625,\n",
              "  0.00739288330078125,\n",
              "  0.01055145263671875,\n",
              "  -0.0010061264038085938,\n",
              "  -0.0118255615234375,\n",
              "  0.048492431640625,\n",
              "  0.0012865066528320312,\n",
              "  0.0701904296875,\n",
              "  0.00011867284774780273,\n",
              "  -0.04376220703125,\n",
              "  -0.03240966796875,\n",
              "  0.030364990234375,\n",
              "  0.01178741455078125,\n",
              "  -0.01348114013671875,\n",
              "  0.0259552001953125,\n",
              "  -0.0021514892578125,\n",
              "  0.0267486572265625,\n",
              "  -0.041534423828125,\n",
              "  0.01525115966796875,\n",
              "  0.04693603515625,\n",
              "  0.0312042236328125,\n",
              "  0.041961669921875,\n",
              "  -0.032684326171875,\n",
              "  0.0245513916015625,\n",
              "  0.00022077560424804688,\n",
              "  -0.0032749176025390625,\n",
              "  -0.00079345703125,\n",
              "  -0.031768798828125,\n",
              "  0.0256805419921875,\n",
              "  0.00333404541015625,\n",
              "  0.0382080078125,\n",
              "  0.007053375244140625,\n",
              "  -0.0216064453125,\n",
              "  -0.04376220703125,\n",
              "  0.016754150390625,\n",
              "  0.0018396377563476562,\n",
              "  0.053802490234375,\n",
              "  -0.021881103515625,\n",
              "  -0.03326416015625,\n",
              "  -0.004909515380859375,\n",
              "  -0.0270233154296875,\n",
              "  -0.022796630859375,\n",
              "  -0.01357269287109375,\n",
              "  -0.00817108154296875,\n",
              "  -0.004428863525390625,\n",
              "  0.036529541015625,\n",
              "  0.0634765625,\n",
              "  -0.0125885009765625,\n",
              "  -0.032867431640625,\n",
              "  0.0258941650390625,\n",
              "  0.02978515625,\n",
              "  -0.0141754150390625,\n",
              "  -0.053497314453125,\n",
              "  -0.017974853515625,\n",
              "  -0.015777587890625,\n",
              "  -0.006023406982421875,\n",
              "  -0.023651123046875,\n",
              "  -0.01436614990234375,\n",
              "  0.00634002685546875,\n",
              "  0.0248870849609375,\n",
              "  -0.0220794677734375,\n",
              "  -0.0341796875,\n",
              "  0.03076171875,\n",
              "  0.02874755859375,\n",
              "  -0.01251983642578125,\n",
              "  0.030303955078125,\n",
              "  -0.0299835205078125,\n",
              "  -0.0230560302734375,\n",
              "  0.024627685546875,\n",
              "  0.03863525390625,\n",
              "  -0.0184478759765625,\n",
              "  0.052093505859375,\n",
              "  0.0341796875,\n",
              "  -0.0131378173828125,\n",
              "  0.0122222900390625,\n",
              "  0.0323486328125,\n",
              "  0.047454833984375,\n",
              "  -0.01380157470703125,\n",
              "  0.040283203125,\n",
              "  -0.01093292236328125,\n",
              "  -0.04327392578125,\n",
              "  0.028594970703125,\n",
              "  0.008544921875,\n",
              "  -0.039459228515625,\n",
              "  -0.030242919921875,\n",
              "  -0.013580322265625,\n",
              "  -0.01235198974609375,\n",
              "  -0.022796630859375,\n",
              "  -0.0369873046875,\n",
              "  -0.02130126953125,\n",
              "  -0.004627227783203125,\n",
              "  0.003696441650390625,\n",
              "  0.0662841796875,\n",
              "  0.0214385986328125,\n",
              "  -0.015899658203125,\n",
              "  0.05120849609375,\n",
              "  0.0099639892578125,\n",
              "  0.04052734375,\n",
              "  -0.0098724365234375,\n",
              "  0.0167694091796875,\n",
              "  0.0240325927734375,\n",
              "  0.05792236328125,\n",
              "  -0.0452880859375,\n",
              "  0.057830810546875,\n",
              "  0.0031871795654296875,\n",
              "  -0.0179443359375,\n",
              "  0.042266845703125,\n",
              "  -0.0318603515625,\n",
              "  0.029327392578125,\n",
              "  -0.0027751922607421875,\n",
              "  0.0750732421875,\n",
              "  -0.0252838134765625,\n",
              "  -0.01175689697265625,\n",
              "  0.0210418701171875,\n",
              "  0.04571533203125,\n",
              "  0.01110076904296875,\n",
              "  -0.033660888671875,\n",
              "  -0.04156494140625,\n",
              "  0.00873565673828125,\n",
              "  0.018218994140625,\n",
              "  -0.046173095703125,\n",
              "  0.0205535888671875,\n",
              "  -0.029876708984375,\n",
              "  0.0221710205078125,\n",
              "  0.0188446044921875,\n",
              "  0.0038738250732421875,\n",
              "  0.0089569091796875,\n",
              "  -0.049530029296875,\n",
              "  0.00844573974609375,\n",
              "  0.0225982666015625,\n",
              "  0.03521728515625,\n",
              "  -0.053497314453125,\n",
              "  -0.033416748046875,\n",
              "  -0.028472900390625,\n",
              "  -0.013824462890625,\n",
              "  0.01959228515625,\n",
              "  0.007843017578125,\n",
              "  -0.013214111328125,\n",
              "  0.007205963134765625,\n",
              "  -0.00818634033203125,\n",
              "  -0.0400390625,\n",
              "  0.0205230712890625,\n",
              "  0.0193328857421875,\n",
              "  0.0430908203125,\n",
              "  0.004131317138671875,\n",
              "  -0.03411865234375,\n",
              "  -0.0085906982421875,\n",
              "  -0.004669189453125,\n",
              "  0.08306884765625,\n",
              "  0.032012939453125,\n",
              "  0.07745361328125,\n",
              "  -0.05914306640625,\n",
              "  0.057769775390625,\n",
              "  0.032684326171875,\n",
              "  0.003383636474609375,\n",
              "  -0.0216827392578125,\n",
              "  -0.01544952392578125,\n",
              "  -0.0024471282958984375,\n",
              "  0.01226806640625,\n",
              "  0.0260467529296875,\n",
              "  0.006671905517578125,\n",
              "  -0.00897979736328125,\n",
              "  0.0469970703125,\n",
              "  -0.06488037109375,\n",
              "  0.0011539459228515625,\n",
              "  -0.0225830078125,\n",
              "  0.03863525390625,\n",
              "  0.01678466796875,\n",
              "  -0.00665283203125,\n",
              "  -0.0458984375,\n",
              "  0.00609588623046875,\n",
              "  0.03759765625,\n",
              "  -0.036407470703125,\n",
              "  -0.0036640167236328125,\n",
              "  0.0224761962890625,\n",
              "  -0.040283203125,\n",
              "  0.05853271484375,\n",
              "  -0.038909912109375,\n",
              "  0.014862060546875,\n",
              "  -0.048797607421875,\n",
              "  0.004055023193359375,\n",
              "  0.0186920166015625,\n",
              "  0.01296234130859375,\n",
              "  -0.03167724609375,\n",
              "  -0.0179290771484375,\n",
              "  -0.0102386474609375,\n",
              "  -0.055633544921875,\n",
              "  -0.0233306884765625,\n",
              "  0.026153564453125,\n",
              "  -0.01229095458984375,\n",
              "  0.00786590576171875,\n",
              "  0.0055999755859375,\n",
              "  0.005390167236328125,\n",
              "  0.0308990478515625,\n",
              "  0.036346435546875,\n",
              "  -0.0033893585205078125,\n",
              "  -0.035369873046875,\n",
              "  0.004795074462890625,\n",
              "  0.0010013580322265625,\n",
              "  0.03656005859375,\n",
              "  0.039154052734375,\n",
              "  0.02569580078125,\n",
              "  0.031097412109375,\n",
              "  -0.0006604194641113281,\n",
              "  -0.01287841796875,\n",
              "  0.016876220703125,\n",
              "  0.0227508544921875,\n",
              "  0.006450653076171875,\n",
              "  -0.04296875,\n",
              "  -0.062744140625,\n",
              "  0.00345611572265625,\n",
              "  0.020355224609375,\n",
              "  0.0158843994140625,\n",
              "  0.006061553955078125,\n",
              "  -0.032470703125,\n",
              "  0.005649566650390625,\n",
              "  -0.03973388671875,\n",
              "  -0.054046630859375,\n",
              "  -0.049713134765625,\n",
              "  -0.03070068359375,\n",
              "  -0.0018072128295898438,\n",
              "  0.0390625,\n",
              "  0.00736236572265625,\n",
              "  -0.0028095245361328125,\n",
              "  -0.027557373046875,\n",
              "  0.01374053955078125,\n",
              "  -0.0513916015625,\n",
              "  0.040313720703125,\n",
              "  -0.0019521713256835938,\n",
              "  -0.0177459716796875,\n",
              "  -0.00958251953125,\n",
              "  -0.038604736328125,\n",
              "  -0.0184173583984375,\n",
              "  0.0113067626953125,\n",
              "  0.042999267578125,\n",
              "  0.002498626708984375,\n",
              "  -0.0242462158203125,\n",
              "  0.01204681396484375,\n",
              "  0.0181884765625,\n",
              "  -0.00286102294921875,\n",
              "  -0.0188446044921875,\n",
              "  0.02093505859375,\n",
              "  -0.0026683807373046875,\n",
              "  -0.04901123046875,\n",
              "  -0.050537109375,\n",
              "  0.01128387451171875,\n",
              "  -0.0142974853515625,\n",
              "  -0.0181427001953125,\n",
              "  0.02392578125,\n",
              "  -0.0634765625,\n",
              "  0.02728271484375,\n",
              "  0.01024627685546875,\n",
              "  -0.04705810546875,\n",
              "  0.004352569580078125,\n",
              "  -0.01690673828125,\n",
              "  0.0328369140625,\n",
              "  -0.01373291015625,\n",
              "  -0.0400390625,\n",
              "  -0.07275390625,\n",
              "  0.054290771484375,\n",
              "  0.025238037109375,\n",
              "  -0.07568359375,\n",
              "  0.00762176513671875,\n",
              "  -0.0008940696716308594,\n",
              "  -0.025177001953125,\n",
              "  -0.04888916015625,\n",
              "  0.00786590576171875,\n",
              "  -0.0109405517578125,\n",
              "  -0.018341064453125,\n",
              "  0.0265045166015625,\n",
              "  -0.0341796875,\n",
              "  -0.03216552734375,\n",
              "  0.025665283203125,\n",
              "  -0.0595703125,\n",
              "  -0.00571441650390625,\n",
              "  -0.022735595703125,\n",
              "  0.0158233642578125,\n",
              "  -0.046356201171875,\n",
              "  0.0249481201171875,\n",
              "  -0.01904296875,\n",
              "  -0.0670166015625,\n",
              "  -0.0208282470703125,\n",
              "  -0.0183258056640625,\n",
              "  -0.00811767578125,\n",
              "  -0.0272064208984375,\n",
              "  0.005126953125,\n",
              "  -0.002513885498046875,\n",
              "  -0.0283050537109375,\n",
              "  -0.053619384765625,\n",
              "  -0.04693603515625,\n",
              "  0.0284271240234375,\n",
              "  0.00412750244140625,\n",
              "  -0.046722412109375,\n",
              "  0.04852294921875,\n",
              "  0.0266571044921875,\n",
              "  -0.0143890380859375,\n",
              "  -0.005023956298828125,\n",
              "  0.0167694091796875,\n",
              "  0.0104217529296875,\n",
              "  0.020965576171875,\n",
              "  -0.021728515625,\n",
              "  -0.02099609375,\n",
              "  -0.011505126953125,\n",
              "  -0.0207672119140625,\n",
              "  0.047119140625,\n",
              "  -0.0161895751953125,\n",
              "  -0.0102081298828125,\n",
              "  -0.014739990234375,\n",
              "  0.0271148681640625,\n",
              "  0.0290374755859375,\n",
              "  0.0267181396484375,\n",
              "  0.029144287109375,\n",
              "  0.020904541015625,\n",
              "  0.0311126708984375,\n",
              "  0.028594970703125,\n",
              "  0.020294189453125,\n",
              "  -0.0031795501708984375,\n",
              "  -0.030487060546875,\n",
              "  0.006755828857421875,\n",
              "  0.00379180908203125,\n",
              "  -0.0086517333984375,\n",
              "  -0.0120086669921875,\n",
              "  0.00811004638671875,\n",
              "  -0.0008883476257324219,\n",
              "  -0.037322998046875,\n",
              "  -0.0304107666015625,\n",
              "  0.07354736328125,\n",
              "  0.0928955078125,\n",
              "  -0.0189208984375,\n",
              "  0.0218963623046875,\n",
              "  -0.002536773681640625,\n",
              "  -0.0180511474609375,\n",
              "  0.03607177734375,\n",
              "  -0.0260009765625,\n",
              "  -0.03564453125,\n",
              "  -0.0157012939453125,\n",
              "  0.043975830078125,\n",
              "  -0.010345458984375,\n",
              "  -0.0201416015625,\n",
              "  -0.054595947265625,\n",
              "  0.005184173583984375,\n",
              "  -0.04925537109375,\n",
              "  0.00624847412109375,\n",
              "  0.04608154296875,\n",
              "  -0.048675537109375,\n",
              "  -0.0316162109375,\n",
              "  -0.009796142578125,\n",
              "  0.0251007080078125,\n",
              "  -0.020904541015625,\n",
              "  -0.0118255615234375,\n",
              "  0.04052734375,\n",
              "  -0.006160736083984375,\n",
              "  -0.0478515625,\n",
              "  -0.04876708984375,\n",
              "  0.011505126953125,\n",
              "  -0.030670166015625,\n",
              "  0.0718994140625,\n",
              "  -0.033477783203125,\n",
              "  -0.0267181396484375,\n",
              "  -0.0615234375,\n",
              "  0.0228424072265625,\n",
              "  -0.0269775390625,\n",
              "  0.00011092424392700195,\n",
              "  0.055023193359375,\n",
              "  0.0039520263671875,\n",
              "  -0.00814056396484375,\n",
              "  -0.0345458984375,\n",
              "  0.022857666015625,\n",
              "  -0.0113372802734375,\n",
              "  0.0235443115234375,\n",
              "  -0.035736083984375,\n",
              "  -0.032470703125,\n",
              "  0.0226898193359375,\n",
              "  -0.0011301040649414062,\n",
              "  0.03631591796875,\n",
              "  -0.02593994140625,\n",
              "  0.03692626953125,\n",
              "  -0.0005440711975097656,\n",
              "  0.0262603759765625,\n",
              "  -0.07421875,\n",
              "  -0.007305145263671875,\n",
              "  1.7881393432617188e-07,\n",
              "  -0.0258941650390625,\n",
              "  -0.0204010009765625,\n",
              "  -0.052032470703125,\n",
              "  -0.048828125,\n",
              "  0.02154541015625,\n",
              "  0.01355743408203125,\n",
              "  -0.051910400390625,\n",
              "  0.037628173828125,\n",
              "  0.02532958984375,\n",
              "  0.0235137939453125,\n",
              "  0.0182647705078125,\n",
              "  0.00873565673828125,\n",
              "  0.0312347412109375,\n",
              "  0.0601806640625,\n",
              "  -0.017669677734375,\n",
              "  0.01235198974609375,\n",
              "  0.00439453125,\n",
              "  0.0121002197265625,\n",
              "  -0.004058837890625,\n",
              "  0.0107421875,\n",
              "  -0.0185089111328125,\n",
              "  0.0277252197265625,\n",
              "  0.0325927734375,\n",
              "  0.020233154296875,\n",
              "  0.038360595703125,\n",
              "  -0.0628662109375,\n",
              "  -0.0109405517578125,\n",
              "  -0.046112060546875,\n",
              "  -0.032958984375,\n",
              "  0.00852203369140625,\n",
              "  0.05975341796875,\n",
              "  0.0185394287109375,\n",
              "  0.016845703125,\n",
              "  -0.01561737060546875,\n",
              "  -0.0164031982421875,\n",
              "  0.003948211669921875,\n",
              "  -0.00644683837890625,\n",
              "  -0.07550048828125,\n",
              "  -0.005153656005859375,\n",
              "  -0.03863525390625,\n",
              "  0.03851318359375,\n",
              "  -0.0007991790771484375,\n",
              "  0.040283203125,\n",
              "  0.006473541259765625,\n",
              "  -0.0048828125,\n",
              "  0.03570556640625,\n",
              "  -0.04327392578125,\n",
              "  -0.03314208984375,\n",
              "  -0.022979736328125,\n",
              "  0.025909423828125,\n",
              "  0.036529541015625,\n",
              "  0.044036865234375,\n",
              "  0.02947998046875,\n",
              "  0.00814056396484375,\n",
              "  -0.033203125,\n",
              "  0.0234527587890625,\n",
              "  0.04833984375,\n",
              "  -0.0243072509765625,\n",
              "  -0.038909912109375,\n",
              "  -0.0134124755859375,\n",
              "  -0.018280029296875,\n",
              "  -0.030853271484375,\n",
              "  0.01061248779296875,\n",
              "  -0.04254150390625,\n",
              "  -0.0038509368896484375,\n",
              "  -0.00238800048828125,\n",
              "  0.006587982177734375,\n",
              "  0.04046630859375,\n",
              "  -0.0214080810546875,\n",
              "  0.004405975341796875,\n",
              "  0.0057830810546875,\n",
              "  0.0181427001953125,\n",
              "  -0.0078887939453125,\n",
              "  -0.0250396728515625,\n",
              "  0.0159454345703125,\n",
              "  0.0233917236328125,\n",
              "  0.036468505859375,\n",
              "  0.0267333984375,\n",
              "  -0.044189453125,\n",
              "  0.045196533203125,\n",
              "  -0.01300811767578125,\n",
              "  -0.026031494140625,\n",
              "  0.033660888671875,\n",
              "  -0.0212249755859375,\n",
              "  -0.021575927734375,\n",
              "  -0.047119140625,\n",
              "  0.043182373046875,\n",
              "  -0.01861572265625,\n",
              "  0.014373779296875,\n",
              "  -0.00925445556640625,\n",
              "  -0.016326904296875,\n",
              "  -0.005878448486328125,\n",
              "  -0.00684356689453125,\n",
              "  -0.03802490234375,\n",
              "  -0.00640869140625,\n",
              "  -0.01271820068359375,\n",
              "  0.0301513671875,\n",
              "  -0.0010423660278320312,\n",
              "  -0.0491943359375,\n",
              "  0.00024235248565673828,\n",
              "  -0.0025501251220703125,\n",
              "  0.01122283935546875,\n",
              "  0.0208587646484375,\n",
              "  0.01190185546875,\n",
              "  -0.004703521728515625,\n",
              "  0.0187835693359375,\n",
              "  -0.0239410400390625,\n",
              "  -0.0295867919921875,\n",
              "  -0.0245819091796875,\n",
              "  -0.0496826171875,\n",
              "  -0.0181427001953125,\n",
              "  -0.0208587646484375,\n",
              "  0.013885498046875,\n",
              "  0.026702880859375,\n",
              "  0.013702392578125,\n",
              "  -0.034423828125,\n",
              "  0.0330810546875,\n",
              "  -0.0197906494140625,\n",
              "  -0.05670166015625,\n",
              "  0.031646728515625,\n",
              "  -0.03070068359375,\n",
              "  -0.01415252685546875,\n",
              "  -0.0171661376953125,\n",
              "  0.055877685546875,\n",
              "  -0.0204925537109375,\n",
              "  -0.0022869110107421875,\n",
              "  -0.0286102294921875,\n",
              "  0.033416748046875,\n",
              "  -0.0767822265625,\n",
              "  0.0390625,\n",
              "  0.03216552734375,\n",
              "  -0.0121917724609375,\n",
              "  -0.0208282470703125,\n",
              "  -0.0223236083984375,\n",
              "  0.0543212890625,\n",
              "  -0.00798797607421875,\n",
              "  0.039703369140625,\n",
              "  -0.007053375244140625,\n",
              "  -0.020751953125,\n",
              "  0.0005011558532714844,\n",
              "  -0.0117340087890625,\n",
              "  -0.04583740234375,\n",
              "  0.00305938720703125,\n",
              "  -0.01157379150390625,\n",
              "  -0.016082763671875,\n",
              "  0.0006213188171386719,\n",
              "  -0.02191162109375,\n",
              "  0.0281219482421875,\n",
              "  -0.029571533203125,\n",
              "  -0.057464599609375,\n",
              "  0.02655029296875,\n",
              "  0.0038604736328125,\n",
              "  0.02777099609375,\n",
              "  0.01690673828125,\n",
              "  -0.015228271484375,\n",
              "  0.0599365234375,\n",
              "  0.022125244140625,\n",
              "  -0.00337982177734375,\n",
              "  -0.0034618377685546875,\n",
              "  -0.0307769775390625,\n",
              "  0.012847900390625,\n",
              "  0.00415802001953125,\n",
              "  -0.05047607421875,\n",
              "  -0.000827789306640625,\n",
              "  0.00235748291015625,\n",
              "  -0.0175628662109375,\n",
              "  0.00577545166015625,\n",
              "  0.0005931854248046875,\n",
              "  0.007480621337890625,\n",
              "  -0.037109375,\n",
              "  0.043609619140625,\n",
              "  0.006969451904296875,\n",
              "  0.00554656982421875,\n",
              "  0.0511474609375,\n",
              "  0.035308837890625,\n",
              "  0.0330810546875,\n",
              "  0.01248931884765625,\n",
              "  0.0179290771484375,\n",
              "  0.0086822509765625,\n",
              "  -0.018798828125,\n",
              "  0.03106689453125,\n",
              "  0.0147705078125,\n",
              "  0.05303955078125,\n",
              "  -0.0174713134765625,\n",
              "  0.024261474609375,\n",
              "  -0.0151519775390625,\n",
              "  0.0022144317626953125,\n",
              "  -0.032318115234375,\n",
              "  0.0184173583984375,\n",
              "  -0.05181884765625,\n",
              "  0.039154052734375,\n",
              "  0.0172119140625,\n",
              "  -0.031524658203125,\n",
              "  -0.00989532470703125,\n",
              "  -0.0222320556640625,\n",
              "  -0.02996826171875,\n",
              "  -0.010040283203125,\n",
              "  0.0041351318359375,\n",
              "  -0.003780364990234375,\n",
              "  0.06109619140625,\n",
              "  -0.0190887451171875,\n",
              "  -0.0325927734375,\n",
              "  -0.0304107666015625,\n",
              "  -0.03521728515625,\n",
              "  -0.021728515625,\n",
              "  -0.033294677734375,\n",
              "  0.043853759765625,\n",
              "  0.0472412109375,\n",
              "  -0.0203704833984375,\n",
              "  -0.018402099609375,\n",
              "  0.0513916015625,\n",
              "  0.018951416015625,\n",
              "  0.01617431640625,\n",
              "  0.0198822021484375,\n",
              "  -0.023681640625,\n",
              "  0.0313720703125,\n",
              "  0.021087646484375,\n",
              "  ...]}"
            ]
          },
          "execution_count": 21,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "final_ds[0]"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "xLt7W0oZrqkU"
      },
      "source": [
        "## Searching in English and Spanish for similar sentences\n",
        "\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "O1dRaYmRrqkU"
      },
      "source": [
        "Now that we have upserted our data, we can use the same pipeline to embed queries to the model and search!\n",
        "\n",
        "We'll do English-English search as a way to understand the output first."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "L0Utj2n_rqkU",
        "outputId": "69389ef0-1af7-4807-8c0f-980fdb55e68d"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'matches': [{'id': '413',\n",
              "              'metadata': {'lang': 'en', 'text': 'I am playing in the park.'},\n",
              "              'score': 0.833629489,\n",
              "              'values': []},\n",
              "             {'id': '368',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'We were playing in the park.'},\n",
              "              'score': 0.821107924,\n",
              "              'values': []},\n",
              "             {'id': '156',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'We were playing in the park.'},\n",
              "              'score': 0.821107924,\n",
              "              'values': []},\n",
              "             {'id': '122',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'It was fun playing in the park.'},\n",
              "              'score': 0.820462883,\n",
              "              'values': []},\n",
              "             {'id': '389',\n",
              "              'metadata': {'lang': 'en', 'text': 'The kids play in the park.'},\n",
              "              'score': 0.810867846,\n",
              "              'values': []},\n",
              "             {'id': '136',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'They were playing baseball in the park.'},\n",
              "              'score': 0.810379803,\n",
              "              'values': []},\n",
              "             {'id': '90',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'They were playing baseball in the park.'},\n",
              "              'score': 0.810379803,\n",
              "              'values': []},\n",
              "             {'id': '62',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'We went to the park to play.'},\n",
              "              'score': 0.807442307,\n",
              "              'values': []},\n",
              "             {'id': '245',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'We went to the park to play.'},\n",
              "              'score': 0.807442307,\n",
              "              'values': []},\n",
              "             {'id': '139',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'There was a group of children playing in '\n",
              "                                   'the park.'},\n",
              "              'score': 0.807257056,\n",
              "              'values': []}],\n",
              " 'namespace': '',\n",
              " 'usage': {'read_units': 6}}"
            ]
          },
          "execution_count": 22,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "\n",
        "def embed_query(sentence):\n",
        "  # this time, we embed the queries as \"query\" type\n",
        "\n",
        "    embeddings = pc.inference.embed(\n",
        "        model = \"multilingual-e5-large\",\n",
        "        # must be passed as a list\n",
        "        inputs = [sentence],\n",
        "        parameters = {\"input_type\": \"query\"}\n",
        "    )\n",
        "\n",
        "    # this allows us to pull out the vector values\n",
        "    return embeddings.data[0][\"values\"]\n",
        "\n",
        "\n",
        "query_vector = embed_query(\"playing a sport at the park\")\n",
        "\n",
        "\n",
        "# We filter on \"en\" to search only over the English Sentences\n",
        "index.query(\n",
        "    vector=query_vector,\n",
        "    filter= {\n",
        "      \"lang\": {\"$eq\": \"en\"}\n",
        "    },\n",
        "    include_metadata=True,\n",
        "    top_k=10)\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ml0w_kl0rqkU"
      },
      "source": [
        "**Switching to Spanish queries requires no additional code**!\n",
        "\n",
        "Here, we search \"I play in the park at the end of the week\".\n",
        "\n",
        "And, notice the results have no exact overlap with the translated phrase!"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "lZ9atiSmrqkU",
        "outputId": "bbd95d1a-0edb-45d9-8385-6d8d9a78a128"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'matches': [{'id': '78',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'I went to the park last Sunday.'},\n",
              "              'score': 0.824530661,\n",
              "              'values': []},\n",
              "             {'id': '32',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'I went to the park last Saturday.'},\n",
              "              'score': 0.822012424,\n",
              "              'values': []},\n",
              "             {'id': '283',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'On Saturdays, we usually visit in this '\n",
              "                                   'park.'},\n",
              "              'score': 0.81381613,\n",
              "              'values': []},\n",
              "             {'id': '192',\n",
              "              'metadata': {'lang': 'en', 'text': 'What happened in the park?'},\n",
              "              'score': 0.812519372,\n",
              "              'values': []},\n",
              "             {'id': '19',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': \"It's a good day for going to the park.\"},\n",
              "              'score': 0.80653882,\n",
              "              'values': []},\n",
              "             {'id': '24',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'That park is full of amusements.'},\n",
              "              'score': 0.805540085,\n",
              "              'values': []},\n",
              "             {'id': '344',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': \"Near my house, there's a park.\"},\n",
              "              'score': 0.803945601,\n",
              "              'values': []},\n",
              "             {'id': '260',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'Something very unusual seems to be '\n",
              "                                   'happening in the park.'},\n",
              "              'score': 0.803942144,\n",
              "              'values': []},\n",
              "             {'id': '122',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'It was fun playing in the park.'},\n",
              "              'score': 0.803871155,\n",
              "              'values': []},\n",
              "             {'id': '124',\n",
              "              'metadata': {'lang': 'en',\n",
              "                           'text': 'He may be jogging around the park.'},\n",
              "              'score': 0.803005636,\n",
              "              'values': []}],\n",
              " 'namespace': '',\n",
              " 'usage': {'read_units': 6}}"
            ]
          },
          "execution_count": 23,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "query_vector = embed_query(\"Jugo en el parque en la fin de semana\")\n",
        "\n",
        "index.query(\n",
        "    vector=query_vector,\n",
        "    filter = {\n",
        "        \"lang\": \"en\"\n",
        "    },\n",
        "    include_metadata=True,\n",
        "    top_k=10)\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "IABlwrOcrqkV"
      },
      "source": [
        "## Bringing it full circle: English-Spanish search\n",
        "\n",
        "\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "C-WWgin1rqkV"
      },
      "source": [
        "Finally, we can do our initial task: learning new Spanish sentences with English! We use the same embedding and preprocessing pipeline, taking care to specify the metadata to return semantically similar results in spanish"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "J6QeVstirqkV",
        "outputId": "79587d4d-d0ed-4416-dc93-a15c7478d42d"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'matches': [{'id': '448',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Fui al parque el s\u00e1bado pasado.'},\n",
              "              'score': 0.835433,\n",
              "              'values': []},\n",
              "             {'id': '494',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Fui al parque el domingo pasado.'},\n",
              "              'score': 0.826080561,\n",
              "              'values': []},\n",
              "             {'id': '511',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Fui al parque a darme un paseo.'},\n",
              "              'score': 0.82015,\n",
              "              'values': []},\n",
              "             {'id': '703',\n",
              "              'metadata': {'lang': 'es', 'text': 'Ayer fui al parque.'},\n",
              "              'score': 0.812328279,\n",
              "              'values': []},\n",
              "             {'id': '809',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Ayer fuiste al parque, \u00bfverdad?'},\n",
              "              'score': 0.807226837,\n",
              "              'values': []},\n",
              "             {'id': '446',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Mi hermana suele ir al parque los fines de '\n",
              "                                   'semana.'},\n",
              "              'score': 0.800018191,\n",
              "              'values': []},\n",
              "             {'id': '661',\n",
              "              'metadata': {'lang': 'es', 'text': 'Fuimos al parque a jugar.'},\n",
              "              'score': 0.799953163,\n",
              "              'values': []},\n",
              "             {'id': '793',\n",
              "              'metadata': {'lang': 'es', 'text': 'Lo vi en el parque.'},\n",
              "              'score': 0.799875677,\n",
              "              'values': []},\n",
              "             {'id': '790',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Fui al parque a jugar al tenis.'},\n",
              "              'score': 0.798786104,\n",
              "              'values': []},\n",
              "             {'id': '604',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Mi hermana normalmente va al parque los '\n",
              "                                   'fines de semana.'},\n",
              "              'score': 0.798756123,\n",
              "              'values': []}],\n",
              " 'namespace': '',\n",
              " 'usage': {'read_units': 6}}"
            ]
          },
          "execution_count": 24,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "query_vector = embed_query(\"I went to the park last weekend to play sports\")\n",
        "\n",
        "\n",
        "index.query(\n",
        "    vector=query_vector,\n",
        "    filter = {\n",
        "        \"lang\":\"es\"\n",
        "    },\n",
        "    include_metadata=True,\n",
        "    top_k=10)\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ts2w92ql5kZx"
      },
      "source": [
        "And just to confirm, let's see what happens when we use the other meaning of \"park\"!"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "TwEWw52vtiNR",
        "outputId": "c3889095-38ad-4248-ba82-614ffd410106"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'matches': [{'id': '493',\n",
              "              'metadata': {'lang': 'es', 'text': '\u00bfD\u00f3nde puedo aparcar?'},\n",
              "              'score': 0.840159774,\n",
              "              'values': []},\n",
              "             {'id': '649',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'No encuentro un lugar para estacionar mi '\n",
              "                                   'nave espacial.'},\n",
              "              'score': 0.838665545,\n",
              "              'values': []},\n",
              "             {'id': '645',\n",
              "              'metadata': {'lang': 'es', 'text': '\u00bfD\u00f3nde puedo estacionar?'},\n",
              "              'score': 0.837124407,\n",
              "              'values': []},\n",
              "             {'id': '620',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'No encuentro d\u00f3nde parquear mi nave '\n",
              "                                   'espacial.'},\n",
              "              'score': 0.836103082,\n",
              "              'values': []},\n",
              "             {'id': '570',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Tuvo un problema al estacionar.'},\n",
              "              'score': 0.833392,\n",
              "              'values': []},\n",
              "             {'id': '743',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': 'Nos pasamos una eternidad buscando '\n",
              "                                   'estacionamiento.'},\n",
              "              'score': 0.83296752,\n",
              "              'values': []},\n",
              "             {'id': '618',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': '\u00bfD\u00f3nde puedo estacionar mi carro?'},\n",
              "              'score': 0.832145393,\n",
              "              'values': []},\n",
              "             {'id': '460',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': '\u00bfD\u00f3nde podemos aparcar el coche?'},\n",
              "              'score': 0.828689933,\n",
              "              'values': []},\n",
              "             {'id': '471',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': '\u00bfPuedo aparcar aqu\u00ed mi coche?'},\n",
              "              'score': 0.828009188,\n",
              "              'values': []},\n",
              "             {'id': '796',\n",
              "              'metadata': {'lang': 'es',\n",
              "                           'text': '\u00bfPuedo aparcar aqu\u00ed mi coche?'},\n",
              "              'score': 0.828009188,\n",
              "              'values': []}],\n",
              " 'namespace': '',\n",
              " 'usage': {'read_units': 6}}"
            ]
          },
          "execution_count": 25,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "query_vector = embed_query(\"I need to find a place to park\")\n",
        "\n",
        "\n",
        "index.query(\n",
        "    vector=query_vector,\n",
        "    filter={\n",
        "        \"lang\":\"es\"\n",
        "    },\n",
        "    include_metadata=True,\n",
        "    top_k=10)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "BRdiGS5-5tbd"
      },
      "source": [
        "The result contains sentences about parking cars, which is much different than our prior query!"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ccXqRJ6crqkV"
      },
      "source": [
        "As we can see, there is no translation happening here, as the only English sentence being used is the query. Somehow, the embedding model captures the meaning of these sentences across languages. That's magical!\n",
        "\n",
        "Try doing the following next:\n",
        "- expanding the search over a larger subset of the data (be sure to be mindful of embedding and upsertion rate limits!\n",
        "- test with a different language pair\n",
        "- incoporating more languages within the same namespace, and see what turns up!\n",
        "\n"
      ]
    },
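    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a sketch of that last idea, Pinecone's metadata filtering supports an `$in` operator, so a single query can match several language codes at once. The helper below is our own illustration (the name `lang_filter` is not part of any library):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "def lang_filter(*langs):\n",
        "    # Build a Pinecone metadata filter matching any of the given language codes\n",
        "    if len(langs) == 1:\n",
        "        return {\"lang\": langs[0]}\n",
        "    return {\"lang\": {\"$in\": list(langs)}}\n",
        "\n",
        "# e.g. index.query(vector=query_vector, filter=lang_filter(\"es\", \"fr\"), include_metadata=True, top_k=10)"
      ]
    },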
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "VHVLB23osraj"
      },
      "outputs": [],
      "source": [
        "pc.delete_index(\"tatoeba-semantic-search\")"
      ]
    }
  ],
  "metadata": {
    "colab": {
      "provenance": []
    },
    "kernelspec": {
      "display_name": "pinecone_torrey",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.11.9"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}