{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "executionInfo": {
     "elapsed": 3527,
     "status": "ok",
     "timestamp": 1648197799884,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "ssfd1qsSxtRS",
    "outputId": "7c6e6585-2362-4be2-da05-db00a0307fe6",
    "tags": []
   },
   "outputs": [],
   "source": [
    "%pip install mmh3 numpy"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## The Bloom embeddings algorithm\n",
    "\n",
    "In a normal embedding table, each word-string is mapped to a distinct ID.\n",
    "Usually these IDs will be sequential, so if you have a vocabulary of 100 words,\n",
    "your words will be mapped to numbers `range(100)`. The sequential IDs can then\n",
    "be used as indices into an embedding table: if you have 100 words in your\n",
    "vocabulary, you have 100 rows in the table, and each word receives its own\n",
    "vector.\n",
    "\n",
    "However, there's no limit to the number of unique words that might occur in a\n",
    "sample of text, while we definitely want a limited number of rows in our\n",
    "embedding table. Some of the rows in our table will therefore need to be shared\n",
    "between multiple words in our vocabulary. One obvious solution is to set aside a\n",
    "single vector in the table. Words 0-98 will each receive their own vector, while\n",
    "all other words are assigned to vector 99.\n",
    "\n",
    "However, this asks vector 99 to do a lot of work. What if we gave more vectors\n",
    "to the unknown words?"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "executionInfo": {
     "elapsed": 10,
     "status": "ok",
     "timestamp": 1648197799885,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "Eb895XpR-VUB"
   },
   "outputs": [],
   "source": [
    "def get_row(word_id, number_vector=100, number_oov=10):\n",
    "    if word_id < (number_vector - number_oov):\n",
    "        return word_id\n",
    "    else:\n",
    "        return number_vector + (word_id % number_oov)"
   ]
  },
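  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check, here's a small self-contained cell. The helper name\n",
    "`get_row_bucketed` is ours, and it assumes the intended behaviour: out-of-vocabulary\n",
    "words share the last `number_oov` rows of the same table, so every index stays in\n",
    "range:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_row_bucketed(word_id, number_vector=100, number_oov=10):\n",
    "    # Hypothetical variant: OOV ids share the table's last rows\n",
    "    if word_id < (number_vector - number_oov):\n",
    "        return word_id\n",
    "    return (number_vector - number_oov) + (word_id % number_oov)\n",
    "\n",
    "# In-vocabulary ids keep their own row\n",
    "assert get_row_bucketed(5) == 5\n",
    "assert get_row_bucketed(89) == 89\n",
    "# Out-of-vocabulary ids share rows 90-99\n",
    "assert get_row_bucketed(150) == 90\n",
    "assert get_row_bucketed(157) == 97\n",
    "assert all(90 <= get_row_bucketed(i) < 100 for i in range(90, 5000))"
   ]
  },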
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This gives the model a little more resolution for the unknown words. If all\n",
    "out-of-vocabulary words are assigned the same vector, then they'll all look\n",
    "identical to the model. Even if the training data actually includes information\n",
    "that shows two different out-of-vocabulary words have important, different\n",
    "implications -- for instance, if one word is a strong indicator of positive\n",
    "sentiment, while the other is a strong indicator of negative sentiment -- the\n",
    "model won't be able to tell them apart. However, if we have 10 buckets for the\n",
    "unknown words, we might get lucky, and assign these words to different buckets.\n",
    "If so, the model would be able to learn that one of the unknown-word vectors\n",
    "makes positive sentiment more likely, while the other vector makes negative\n",
    "sentiment more likely.\n",
    "\n",
    "If this is good, then why not do more of it? Bloom embeddings are like an\n",
    "extreme version, where _every_ word is handled like the unknown words above:\n",
    "there are 100 vectors for the \"unknown\" portion, and 0 for the \"known\" portion.\n",
    "\n",
    "So far, this approach seems weird, but not necessarily good. The part that makes\n",
    "it unfairly effective is the next step: by simply doing the same thing multiple\n",
    "times, we can greatly improve the resolution, and have unique representations\n",
    "for far more words than we have vectors. The code in full:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "executionInfo": {
     "elapsed": 8,
     "status": "ok",
     "timestamp": 1648197799885,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "tTkqM3EixhWM"
   },
   "outputs": [],
   "source": [
    "import numpy\n",
    "import mmh3\n",
    "\n",
    "def allocate(n_vectors, n_dimensions):\n",
    "    table = numpy.zeros((n_vectors, n_dimensions), dtype='f')\n",
    "    table += numpy.random.uniform(-0.1, 0.1, table.size).reshape(table.shape)\n",
    "    return table\n",
    "\n",
    "def get_vector(table, word):\n",
    "    hash1 = mmh3.hash(word, seed=0)\n",
    "    hash2 = mmh3.hash(word, seed=1)\n",
    "    row1 = hash1 % table.shape[0]\n",
    "    row2 = hash2 % table.shape[0]\n",
    "    return table[row1] + table[row2]\n",
    "\n",
    "def update_vector(table, word, d_vector):\n",
    "    hash1 = mmh3.hash(word, seed=0)\n",
    "    hash2 = mmh3.hash(word, seed=1)\n",
    "    row1 = hash1 % table.shape[0]\n",
    "    row2 = hash2 % table.shape[0]\n",
    "    table[row1] -= 0.001 * d_vector\n",
    "    table[row2] -= 0.001 * d_vector"
   ]
  },
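  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick smoke test of the update rule – a sketch of our own that reuses the same\n",
    "two-seed lookup as `get_vector`. Subtracting the gradient from both rows should\n",
    "move the combined vector:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "numpy.random.seed(0)\n",
    "table = numpy.random.uniform(-0.1, 0.1, (100, 8)).astype('f')\n",
    "\n",
    "def combined(word):\n",
    "    # Same two-seed lookup as get_vector above\n",
    "    row1 = mmh3.hash(word, seed=0) % table.shape[0]\n",
    "    row2 = mmh3.hash(word, seed=1) % table.shape[0]\n",
    "    return table[row1] + table[row2]\n",
    "\n",
    "before = combined('hello')\n",
    "# Mimic update_vector: push the gradient through both rows\n",
    "for seed in (0, 1):\n",
    "    row = mmh3.hash('hello', seed=seed) % table.shape[0]\n",
    "    table[row] -= 0.001 * numpy.ones(8)\n",
    "after = combined('hello')\n",
    "assert (after < before).all()"
   ]
  },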
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In this example, we've used two keys, assigned from two random hash functions.\n",
    "It's unlikely that two words will collide on both keys, so by simply summing the\n",
    "vectors together, we'll assign most words a unique representation.\n",
    "\n",
    "For the sake of illustration, let's step through a very small example,\n",
    "explicitly.\n",
    "\n",
    "Let's say we have this vocabulary of 20 words:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "executionInfo": {
     "elapsed": 8,
     "status": "ok",
     "timestamp": 1648197799885,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "QMaz-mr9xjPG"
   },
   "outputs": [],
   "source": [
    "vocab = ['apple', 'strawberry', 'orange', 'juice', 'drink', 'smoothie',\n",
    "         'eat', 'fruit', 'health', 'wellness', 'steak', 'fries', 'ketchup',\n",
    "         'burger', 'chips', 'lobster', 'caviar', 'service', 'waiter', 'chef']"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We'll embed these into two dimensions. Normally this would give us a table of\n",
    "`(20, 2)` floats, which we would randomly initialise. With the hashing trick, we\n",
    "can make the table smaller. Let's give it 15 vectors:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "executionInfo": {
     "elapsed": 8,
     "status": "ok",
     "timestamp": 1648197799886,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "LNg60lvqxkmP"
   },
   "outputs": [],
   "source": [
    "normal_embed = numpy.random.uniform(-0.1, 0.1, (20, 2))\n",
    "hashed_embed = numpy.random.uniform(-0.1, 0.1, (15, 2))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In the normal table, we want to map each word in our vocabulary to its own\n",
    "vector:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "executionInfo": {
     "elapsed": 5,
     "status": "ok",
     "timestamp": 1648197801914,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "9wFC_iR_xlBH"
   },
   "outputs": [],
   "source": [
    "word2id = {}\n",
    "def get_normal_vector(word, table):\n",
    "    if word not in word2id.keys():\n",
    "        word2id[word] = len(word2id)\n",
    "    return table[word2id[word]]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The hashed table only has 15 rows, so some words will have to share. We'll\n",
    "handle this by mapping the word into an arbitrary integer – called a \"hash\n",
    "value\". The hash function will return an arbitrary integer, which we'll mod into\n",
    "the range `(0, 15)`. Importantly, we need to be able to compute _multiple,\n",
    "distinct_ hash values for each key – so Python's built-in hash function is\n",
    "inconvenient. We'll therefore use MurmurHash.\n",
    "\n",
    "Let's see what keys we get for our 20 vocabulary items, using MurmurHash:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "executionInfo": {
     "elapsed": 4,
     "status": "ok",
     "timestamp": 1648197804508,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "Gs69d2KRxmg9"
   },
   "outputs": [],
   "source": [
    "hashes1 = [mmh3.hash(w, 1) % 15 for w in vocab]\n",
    "assert hashes1 == [3, 6, 4, 13, 8, 3, 13, 1, 9, 12, 11, 4, 2, 13, 5, 10, 0, 2, 10, 13]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As you can see, some keys are shared between multiple words, while 2/15 keys are\n",
    "unoccupied. This is obviously unideal! If multiple words have the same key,\n",
    "they'll map to the same vector – as far as the model is concerned, \"strawberry\"\n",
    "and \"heart\" will be indistinguishable. It won't be clear which word was used –\n",
    "they have the same representation.\n",
    "\n",
    "To address this, we simply hash the words again, this time using a different\n",
    "seed – so that we get a different set of arbitrary keys:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "executionInfo": {
     "elapsed": 3,
     "status": "ok",
     "timestamp": 1648197804508,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "acpOxkljynPo"
   },
   "outputs": [],
   "source": [
    "from collections import Counter\n",
    "\n",
    "hashes2 = [mmh3.hash(w, 2) % 15 for w in vocab]\n",
    "assert len(Counter(hashes2).most_common()) == 12"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This one's even worse – 3 keys unoccupied! But our strategy is not to keep drawing until we get a favorable seed. Instead, consider this:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {
    "executionInfo": {
     "elapsed": 3,
     "status": "ok",
     "timestamp": 1648197805024,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "W7tfxLQBytWP"
   },
   "outputs": [],
   "source": [
    "assert len(Counter(zip(hashes1, hashes2))) == 20"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "By combining the results from the two hashes, our 20 words distribute perfectly,\n",
    "into 20 unique combinations. This makes sense: we expect to have some words\n",
    "overlapping on one of the keys, but we'd have to be very unlucky for a pair of\n",
    "words to overlap on _both_ keys.\n",
    "\n",
    "This means that if we simply add the two vectors together, each word once more\n",
    "has a unique representation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "executionInfo": {
     "elapsed": 2,
     "status": "ok",
     "timestamp": 1648197805764,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "wI5yayZWyxVP",
    "outputId": "4f62b77d-709f-483b-a0bb-4e5fbe68d5fe"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "apple -0.033 -0.012\n",
      "strawberry -0.023 -0.037\n",
      "orange 0.158 -0.031\n",
      "juice -0.045 0.139\n",
      "drink 0.024 0.030\n",
      "smoothie 0.121 0.076\n",
      "eat -0.093 0.153\n",
      "fruit 0.083 0.052\n",
      "health 0.064 -0.046\n",
      "wellness 0.143 0.112\n",
      "steak 0.011 -0.097\n",
      "fries 0.036 0.041\n",
      "ketchup 0.081 0.029\n",
      "burger -0.045 0.139\n",
      "chips -0.118 -0.090\n",
      "lobster 0.016 -0.107\n",
      "caviar -0.033 -0.012\n",
      "service 0.081 0.029\n",
      "waiter 0.179 -0.038\n",
      "chef -0.047 0.062\n"
     ]
    }
   ],
   "source": [
    "for word in vocab:\n",
    "    key1 = mmh3.hash(word, 0) % 15\n",
    "    key2 = mmh3.hash(word, 1) % 15\n",
    "    vector = hashed_embed[key1] + hashed_embed[key2]\n",
    "    print(word, '%.3f %.3f' % tuple(vector))"
   ]
  },
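  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "How lucky did we get? As a rough, back-of-the-envelope estimate of our own: with\n",
    "`r` rows and two independent keys, a given pair of words collides on _both_ keys\n",
    "with probability `1 / r**2`. With 20 words and 15 rows that's still almost one\n",
    "expected double collision, so we were a little lucky here – but with realistically\n",
    "sized tables (thousands of rows), the expectation becomes tiny:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from math import comb\n",
    "\n",
    "def expected_double_collisions(n_words, n_rows):\n",
    "    # Each of the comb(n_words, 2) pairs collides on both keys\n",
    "    # with probability 1 / n_rows**2\n",
    "    return comb(n_words, 2) / n_rows**2\n",
    "\n",
    "assert round(expected_double_collisions(20, 15), 2) == 0.84\n",
    "assert expected_double_collisions(20, 2000) < 1e-4"
   ]
  },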
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We now have a function that maps our 20 words to 20 unique vectors – but we're\n",
    "storing weights for only 15 vectors in memory. Now the question is: will we be\n",
    "able to find values for these weights that let us actually map words to useful\n",
    "vectors?\n",
    "\n",
    "Let's do a quick experiment to see how this works. We'll assign \"true\" values\n",
    "for our little vocabulary, and see how well we can approximate them with our\n",
    "compressed table. To get the \"true\" values, we _could_ put the \"science\" in data\n",
    "science, and drag the words around into reasonable-looking clusters. But for our\n",
    "purposes, the actual \"true\" values don't matter. We'll therefore just do a\n",
    "simulation: we'll assign random vectors as the \"true\" state, and see if we can\n",
    "learn values for the hash embeddings that match them.\n",
    "\n",
    "The learning procedure will be a simple stochastic gradient descent:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "colab": {
     "background_save": true
    },
    "executionInfo": {
     "elapsed": 3,
     "status": "aborted",
     "timestamp": 1648199186370,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "ET4n9AA5y0fX"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "499 43.47128495286\n"
     ]
    }
   ],
   "source": [
    "import numpy\n",
    "import numpy.random as random\n",
    "import mmh3\n",
    "\n",
    "random.seed(0)\n",
    "nb_epoch = 500\n",
    "learn_rate = 0.001\n",
    "nr_hash_vector = 1000\n",
    "\n",
    "words = [str(i) for i in range(2000)]\n",
    "true_vectors = numpy.random.uniform(-0.1, 0.1, (len(words), 10))\n",
    "hash_vectors = numpy.random.uniform(-0.1, 0.1, (nr_hash_vector, 10))\n",
    "examples = list(zip(words, true_vectors))\n",
    "\n",
    "for epoch in range(nb_epoch):\n",
    "    random.shuffle(examples)\n",
    "    loss=0.\n",
    "    for word, truth in examples:\n",
    "        key1 = mmh3.hash(word, 0) % nr_hash_vector\n",
    "        key2 = mmh3.hash(word, 1) % nr_hash_vector\n",
    "        hash_vector = hash_vectors[key1] + hash_vectors[key2]\n",
    "        diff = hash_vector - truth\n",
    "        hash_vectors[key1] -= learn_rate * diff\n",
    "        hash_vectors[key2] -= learn_rate * diff\n",
    "        loss += (diff**2).sum()\n",
    "print(epoch, loss)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "It's worth taking some time to play with this simulation. You can start by doing\n",
    "some sanity checks:\n",
    "\n",
    "- How does the loss change with `nr_hash_vector`?\n",
    "- If you remove `key2`, does the loss go up?\n",
    "- What happens if you add more hash keys?\n",
    "- What happens as the vocabulary size increases?\n",
    "- What happens when more dimensions are added?\n",
    "- How sensitive are the hash embeddings to the initial conditions? If we change the random seed, do we ever get unlucky?\n",
    "\n",
    "If you play with the simulation for a while, you'll start to get a good feel for\n",
    "the dynamics, and hopefully you'll have a clear idea of why the technique works."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "TuRoY34yQb0v",
    "tags": []
   },
   "source": [
    "## Bonus Section \n",
    "\n",
    "To make it easier for folks to try out a whole bunch of settings we'd added a little bit of code below that makes it easier to get relevant visuals."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "executionInfo": {
     "elapsed": 2919,
     "status": "ok",
     "timestamp": 1648200042349,
     "user": {
      "displayName": "Vincent D. Warmerdam",
      "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gh4KYzhhhK0YDTnAQsUIaQPw-0dKIP-kLBID7nFdQ=s64",
      "userId": "05641618555626735638"
     },
     "user_tz": -60
    },
    "id": "NPVKX_pbXJYs",
    "outputId": "fc046666-d690-426d-b8a7-dc557f12832d",
    "tags": []
   },
   "outputs": [],
   "source": [
    "%pip install altair pandas"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "colab": {
     "background_save": true
    },
    "id": "nHd1wo6m1q-J"
   },
   "outputs": [],
   "source": [
    "from functools import reduce \n",
    "\n",
    "\n",
    "def calc_losses(epochs=500, seed=0, learn_rate=0.001, nr_hash_vector=1000, n_hash=3, n_words=1000, size_vector=10):\n",
    "    random.seed(seed)\n",
    "    nb_epoch = epochs\n",
    "    learn_rate = learn_rate\n",
    "    nr_hash_vector = nr_hash_vector\n",
    "\n",
    "    words = [str(i) for i in range(n_words)]\n",
    "    true_vectors = numpy.random.uniform(-0.1, 0.1, (len(words), size_vector))\n",
    "    hash_vectors = numpy.random.uniform(-0.1, 0.1, (nr_hash_vector, size_vector))\n",
    "    examples = list(zip(words, true_vectors))\n",
    "\n",
    "    losses = []\n",
    "    for epoch in range(nb_epoch):\n",
    "        random.shuffle(examples)\n",
    "        loss=0.\n",
    "        for word, truth in examples:\n",
    "            keys = [mmh3.hash(word, k) % nr_hash_vector for k in range(n_hash)]\n",
    "            hash_vector = reduce(lambda a, b: a + b, [hash_vectors[k] for k in keys])\n",
    "            diff = hash_vector - truth\n",
    "            for key in keys:\n",
    "                hash_vectors[key] -= learn_rate * diff\n",
    "            loss += (diff**2).sum()\n",
    "        losses.append(loss)\n",
    "    return losses\n",
    "\n",
    "data = []\n",
    "for n_hash in [1, 2, 3, 4, 5]:\n",
    "    losses = calc_losses(nr_hash_vector=2_000, n_words=10_000, n_hash=n_hash, epochs=150)\n",
    "    data = data + [{\"loss\": l, \"nr_hash_vector\": nr_hash_vector, \"n_hash\": str(n_hash), \"epoch\": e} for e, l in enumerate(losses)]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {
    "colab": {
     "background_save": true
    },
    "id": "P0Q0k9bjXMm3"
   },
   "outputs": [
    {
     "data": {
      "text/html": [
       "\n",
       "<div id=\"altair-viz-65d0fb458f5d489699b36239f07dc49b\"></div>\n",
       "<script type=\"text/javascript\">\n",
       "  var VEGA_DEBUG = (typeof VEGA_DEBUG == \"undefined\") ? {} : VEGA_DEBUG;\n",
       "  (function(spec, embedOpt){\n",
       "    let outputDiv = document.currentScript.previousElementSibling;\n",
       "    if (outputDiv.id !== \"altair-viz-65d0fb458f5d489699b36239f07dc49b\") {\n",
       "      outputDiv = document.getElementById(\"altair-viz-65d0fb458f5d489699b36239f07dc49b\");\n",
       "    }\n",
       "    const paths = {\n",
       "      \"vega\": \"https://cdn.jsdelivr.net/npm//vega@5?noext\",\n",
       "      \"vega-lib\": \"https://cdn.jsdelivr.net/npm//vega-lib?noext\",\n",
       "      \"vega-lite\": \"https://cdn.jsdelivr.net/npm//vega-lite@4.17.0?noext\",\n",
       "      \"vega-embed\": \"https://cdn.jsdelivr.net/npm//vega-embed@6?noext\",\n",
       "    };\n",
       "\n",
       "    function maybeLoadScript(lib, version) {\n",
       "      var key = `${lib.replace(\"-\", \"\")}_version`;\n",
       "      return (VEGA_DEBUG[key] == version) ?\n",
       "        Promise.resolve(paths[lib]) :\n",
       "        new Promise(function(resolve, reject) {\n",
       "          var s = document.createElement('script');\n",
       "          document.getElementsByTagName(\"head\")[0].appendChild(s);\n",
       "          s.async = true;\n",
       "          s.onload = () => {\n",
       "            VEGA_DEBUG[key] = version;\n",
       "            return resolve(paths[lib]);\n",
       "          };\n",
       "          s.onerror = () => reject(`Error loading script: ${paths[lib]}`);\n",
       "          s.src = paths[lib];\n",
       "        });\n",
       "    }\n",
       "\n",
       "    function showError(err) {\n",
       "      outputDiv.innerHTML = `<div class=\"error\" style=\"color:red;\">${err}</div>`;\n",
       "      throw err;\n",
       "    }\n",
       "\n",
       "    function displayChart(vegaEmbed) {\n",
       "      vegaEmbed(outputDiv, spec, embedOpt)\n",
       "        .catch(err => showError(`Javascript Error: ${err.message}<br>This usually means there's a typo in your chart specification. See the javascript console for the full traceback.`));\n",
       "    }\n",
       "\n",
       "    if(typeof define === \"function\" && define.amd) {\n",
       "      requirejs.config({paths});\n",
       "      require([\"vega-embed\"], displayChart, err => showError(`Error loading script: ${err.message}`));\n",
       "    } else {\n",
       "      maybeLoadScript(\"vega\", \"5\")\n",
       "        .then(() => maybeLoadScript(\"vega-lite\", \"4.17.0\"))\n",
       "        .then(() => maybeLoadScript(\"vega-embed\", \"6\"))\n",
       "        .catch(showError)\n",
       "        .then(() => displayChart(vegaEmbed));\n",
       "    }\n",
       "  })({\"config\": {\"view\": {\"continuousWidth\": 400, \"continuousHeight\": 300}}, \"data\": {\"name\": \"data-ca27f03c8630fac574892f65caf383bc\"}, \"mark\": \"line\", \"encoding\": {\"color\": {\"field\": \"n_hash\", \"type\": \"nominal\"}, \"x\": {\"field\": \"epoch\", \"type\": \"quantitative\"}, \"y\": {\"field\": \"loss\", \"type\": \"quantitative\"}}, \"height\": 250, \"selection\": {\"selector001\": {\"type\": \"interval\", \"bind\": \"scales\", \"encodings\": [\"x\", \"y\"]}}, \"width\": 600, \"$schema\": \"https://vega.github.io/schema/vega-lite/v4.17.0.json\", \"datasets\": {\"data-ca27f03c8630fac574892f65caf383bc\": [{\"loss\": 668.3499506289743, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 0}, {\"loss\": 663.6874727511065, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 1}, {\"loss\": 659.0873838836017, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 2}, {\"loss\": 654.5487444192714, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 3}, {\"loss\": 650.0705679181584, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 4}, {\"loss\": 645.6520017126369, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 5}, {\"loss\": 641.2921765223376, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 6}, {\"loss\": 636.990224306468, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 7}, {\"loss\": 632.7452777976187, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 8}, {\"loss\": 628.5564843744703, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 9}, {\"loss\": 624.4230170910583, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 10}, {\"loss\": 620.3440260293443, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 11}, {\"loss\": 616.3187208129315, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 12}, {\"loss\": 612.3463461864325, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 13}, {\"loss\": 608.4261526877073, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 14}, 
{\"loss\": 604.5573040427984, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 15}, {\"loss\": 600.7390316874059, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 16}, {\"loss\": 596.9706390541346, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 17}, {\"loss\": 593.2513951743719, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 18}, {\"loss\": 589.5805711434293, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 19}, {\"loss\": 585.9574235243194, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 20}, {\"loss\": 582.3813120104709, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 21}, {\"loss\": 578.851538960067, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 22}, {\"loss\": 575.3674076727202, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 23}, {\"loss\": 571.928296458518, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 24}, {\"loss\": 568.5335329039646, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 25}, {\"loss\": 565.1824743454237, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 26}, {\"loss\": 561.8745134963704, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 27}, {\"loss\": 558.6089947818703, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 28}, {\"loss\": 555.3853156152278, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 29}, {\"loss\": 552.2029181024246, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 30}, {\"loss\": 549.0611641131138, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 31}, {\"loss\": 545.9594666995047, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 32}, {\"loss\": 542.8972956232864, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 33}, {\"loss\": 539.8740700356487, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 34}, {\"loss\": 536.8891954203904, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 35}, {\"loss\": 533.9421764256891, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 36}, 
{\"loss\": 531.0324958926681, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 37}, {\"loss\": 528.1595651047029, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 38}, {\"loss\": 525.3228566397228, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 39}, {\"loss\": 522.5218834709727, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 40}, {\"loss\": 519.7561195636591, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 41}, {\"loss\": 517.0251021819004, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 42}, {\"loss\": 514.3282892971879, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 43}, {\"loss\": 511.665240597295, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 44}, {\"loss\": 509.03549570488747, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 45}, {\"loss\": 506.4385666991664, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 46}, {\"loss\": 503.8739646819671, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 47}, {\"loss\": 501.34123163086167, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 48}, {\"loss\": 498.8399598796967, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 49}, {\"loss\": 496.3697527254468, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 50}, {\"loss\": 493.9301395196424, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 51}, {\"loss\": 491.5206734653956, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 52}, {\"loss\": 489.1409391354956, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 53}, {\"loss\": 486.7905225324951, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 54}, {\"loss\": 484.46902538476724, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 55}, {\"loss\": 482.1760379647556, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 56}, {\"loss\": 479.9111561599257, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 57}, {\"loss\": 477.6740219613683, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 
58}, {\"loss\": 475.4642627903134, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 59}, {\"loss\": 473.28147190842975, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 60}, {\"loss\": 471.1252954645938, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 61}, {\"loss\": 468.99535202077647, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 62}, {\"loss\": 466.89128850327734, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 63}, {\"loss\": 464.8127680865716, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 64}, {\"loss\": 462.75943693864286, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 65}, {\"loss\": 460.7309589267515, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 66}, {\"loss\": 458.72696100841046, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 67}, {\"loss\": 456.74713161091967, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 68}, {\"loss\": 454.79113166314795, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 69}, {\"loss\": 452.8586506689025, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 70}, {\"loss\": 450.9493589760761, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 71}, {\"loss\": 449.0629502018819, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 72}, {\"loss\": 447.1991153161835, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 73}, {\"loss\": 445.35754382112304, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 74}, {\"loss\": 443.5379298740077, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 75}, {\"loss\": 441.7399863257389, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 76}, {\"loss\": 439.96343609367733, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 77}, {\"loss\": 438.20796156554917, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 78}, {\"loss\": 436.47331746286625, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 79}, {\"loss\": 434.75921328961965, \"nr_hash_vector\": 1000, \"n_hash\": 
\"1\", \"epoch\": 80}, {\"loss\": 433.0653702359491, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 81}, {\"loss\": 431.39153680121086, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 82}, {\"loss\": 429.73740599568123, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 83}, {\"loss\": 428.1027504625327, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 84}, {\"loss\": 426.4872925183241, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 85}, {\"loss\": 424.8907825332398, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 86}, {\"loss\": 423.31296689598065, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 87}, {\"loss\": 421.7536181928929, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 88}, {\"loss\": 420.21247891457836, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 89}, {\"loss\": 418.6892969406071, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 90}, {\"loss\": 417.1838457848242, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 91}, {\"loss\": 415.6958938134646, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 92}, {\"loss\": 414.2252214386121, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 93}, {\"loss\": 412.77159449821096, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 94}, {\"loss\": 411.334768354856, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 95}, {\"loss\": 409.9145424150145, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 96}, {\"loss\": 408.51072715030006, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 97}, {\"loss\": 407.12307836787215, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 98}, {\"loss\": 405.7513651188572, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 99}, {\"loss\": 404.39541277288953, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 100}, {\"loss\": 403.0550060935113, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 101}, {\"loss\": 401.7299434845996, \"nr_hash_vector\": 1000, 
\"n_hash\": \"1\", \"epoch\": 102}, {\"loss\": 400.42004342991714, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 103}, {\"loss\": 399.12508410124804, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 104}, {\"loss\": 397.8448843925245, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 105}, {\"loss\": 396.5792407556909, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 106}, {\"loss\": 395.327984005946, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 107}, {\"loss\": 394.09093982284986, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 108}, {\"loss\": 392.8679076705409, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 109}, {\"loss\": 391.6587050185882, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 110}, {\"loss\": 390.46317637211877, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 111}, {\"loss\": 389.2811131477707, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 112}, {\"loss\": 388.11236837934996, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 113}, {\"loss\": 386.9567691947869, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 114}, {\"loss\": 385.81413685724897, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 115}, {\"loss\": 384.68431859166844, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 116}, {\"loss\": 383.56715896524327, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 117}, {\"loss\": 382.4624823132795, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 118}, {\"loss\": 381.37012807312243, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 119}, {\"loss\": 380.2899614278496, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 120}, {\"loss\": 379.2218081584793, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 121}, {\"loss\": 378.1655267847331, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 122}, {\"loss\": 377.1209404895942, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 123}, {\"loss\": 
376.0879324772341, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 124}, {\"loss\": 375.0663389000902, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 125}, {\"loss\": 374.0560235929377, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 126}, {\"loss\": 373.0568583180654, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 127}, {\"loss\": 372.068692402873, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 128}, {\"loss\": 371.0913981855343, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 129}, {\"loss\": 370.1248458635677, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 130}, {\"loss\": 369.1688859359097, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 131}, {\"loss\": 368.2234038709692, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 132}, {\"loss\": 367.28825957423345, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 133}, {\"loss\": 366.36330602482224, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 134}, {\"loss\": 365.44843114595477, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 135}, {\"loss\": 364.5435124591558, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 136}, {\"loss\": 363.6484317780454, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 137}, {\"loss\": 362.76306349215037, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 138}, {\"loss\": 361.8872863166009, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 139}, {\"loss\": 361.0209851501999, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 140}, {\"loss\": 360.16404775648255, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 141}, {\"loss\": 359.31634896753656, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 142}, {\"loss\": 358.4777802900677, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 143}, {\"loss\": 357.6482205394947, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 144}, {\"loss\": 356.8275661661273, \"nr_hash_vector\": 1000, \"n_hash\": 
\"1\", \"epoch\": 145}, {\"loss\": 356.01573303461635, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 146}, {\"loss\": 355.2125879799334, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 147}, {\"loss\": 354.41802044162586, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 148}, {\"loss\": 353.63193778056416, \"nr_hash_vector\": 1000, \"n_hash\": \"1\", \"epoch\": 149}, {\"loss\": 998.2363716943039, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 0}, {\"loss\": 981.2348028838925, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 1}, {\"loss\": 964.6840425728037, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 2}, {\"loss\": 948.5712832020793, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 3}, {\"loss\": 932.8831459088995, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 4}, {\"loss\": 917.6075550329675, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 5}, {\"loss\": 902.7329269104192, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 6}, {\"loss\": 888.2476455272928, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 7}, {\"loss\": 874.1404895568755, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 8}, {\"loss\": 860.4004901514966, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 9}, {\"loss\": 847.0171753364675, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 10}, {\"loss\": 833.9803716816258, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 11}, {\"loss\": 821.2799658588449, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 12}, {\"loss\": 808.9066759684524, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 13}, {\"loss\": 796.8511654833394, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 14}, {\"loss\": 785.1043846778916, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 15}, {\"loss\": 773.6574597329717, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 16}, {\"loss\": 762.5021367761733, \"nr_hash_vector\": 1000, \"n_hash\": 
\"2\", \"epoch\": 17}, {\"loss\": 751.6303056929374, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 18}, {\"loss\": 741.0336554485776, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 19}, {\"loss\": 730.7046168448155, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 20}, {\"loss\": 720.6357612316746, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 21}, {\"loss\": 710.8196919316028, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 22}, {\"loss\": 701.2493925420382, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 23}, {\"loss\": 691.9181727027126, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 24}, {\"loss\": 682.8192643680873, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 25}, {\"loss\": 673.9463478602195, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 26}, {\"loss\": 665.2931206453445, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 27}, {\"loss\": 656.8534178336965, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 28}, {\"loss\": 648.6215377649871, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 29}, {\"loss\": 640.5918346309946, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 30}, {\"loss\": 632.7586709437609, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 31}, {\"loss\": 625.1166662545132, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 32}, {\"loss\": 617.6606557701815, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 33}, {\"loss\": 610.3856855373563, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 34}, {\"loss\": 603.2867149042905, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 35}, {\"loss\": 596.3589892744983, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 36}, {\"loss\": 589.597961182303, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 37}, {\"loss\": 582.9991447227576, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 38}, {\"loss\": 576.5580010602216, \"nr_hash_vector\": 1000, 
\"n_hash\": \"2\", \"epoch\": 39}, {\"loss\": 570.2704422516455, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 40}, {\"loss\": 564.1323141340141, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 41}, {\"loss\": 558.1396896810484, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 42}, {\"loss\": 552.2885861218233, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 43}, {\"loss\": 546.5752847510458, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 44}, {\"loss\": 540.9963270964399, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 45}, {\"loss\": 535.5480503334588, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 46}, {\"loss\": 530.2269558340569, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 47}, {\"loss\": 525.02962761536, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 48}, {\"loss\": 519.9529682536997, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 49}, {\"loss\": 514.9938537638034, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 50}, {\"loss\": 510.1491826973853, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 51}, {\"loss\": 505.4159117190565, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 52}, {\"loss\": 500.7911793343251, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 53}, {\"loss\": 496.2721421138942, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 54}, {\"loss\": 491.85606825009194, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 55}, {\"loss\": 487.54025833954006, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 56}, {\"loss\": 483.3221938085105, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 57}, {\"loss\": 479.19945525136677, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 58}, {\"loss\": 475.1694970583023, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 59}, {\"loss\": 471.2299522091879, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 60}, {\"loss\": 467.3786138206197, \"nr_hash_vector\": 
1000, \"n_hash\": \"2\", \"epoch\": 61}, {\"loss\": 463.6132042443144, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 62}, {\"loss\": 459.9315097683442, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 63}, {\"loss\": 456.3314619383465, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 64}, {\"loss\": 452.8109398396176, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 65}, {\"loss\": 449.367958426805, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 66}, {\"loss\": 446.00056965164, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 67}, {\"loss\": 442.7069497152162, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 68}, {\"loss\": 439.48519375574904, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 69}, {\"loss\": 436.3335325861198, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 70}, {\"loss\": 433.25020523558703, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 71}, {\"loss\": 430.23356712036315, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 72}, {\"loss\": 427.28193849381563, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 73}, {\"loss\": 424.3937674637093, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 74}, {\"loss\": 421.56741206155556, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 75}, {\"loss\": 418.80143993207906, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 76}, {\"loss\": 416.0943095659082, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 77}, {\"loss\": 413.44466310664205, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 78}, {\"loss\": 410.85107282314567, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 79}, {\"loss\": 408.3121409487431, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 80}, {\"loss\": 405.8266034954931, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 81}, {\"loss\": 403.39318841455105, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 82}, {\"loss\": 401.01051039015437, 
\"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 83}, {\"loss\": 398.67744525544606, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 84}, {\"loss\": 396.3928548103477, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 85}, {\"loss\": 394.1555808973035, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 86}, {\"loss\": 391.96442517942876, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 87}, {\"loss\": 389.81833996495465, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 88}, {\"loss\": 387.71625918356176, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 89}, {\"loss\": 385.65712844730837, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 90}, {\"loss\": 383.63991296548073, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 91}, {\"loss\": 381.66366365654255, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 92}, {\"loss\": 379.72741748094506, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 93}, {\"loss\": 377.8302755325134, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 94}, {\"loss\": 375.97125276542545, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 95}, {\"loss\": 374.1494427652365, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 96}, {\"loss\": 372.3641217219305, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 97}, {\"loss\": 370.61433330350854, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 98}, {\"loss\": 368.899240208379, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 99}, {\"loss\": 367.2181460598986, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 100}, {\"loss\": 365.57024342967156, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 101}, {\"loss\": 363.9547492043009, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 102}, {\"loss\": 362.370970043808, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 103}, {\"loss\": 360.8181530248504, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 104}, {\"loss\": 
359.2956069418128, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 105}, {\"loss\": 357.80260719715426, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 106}, {\"loss\": 356.3385412267502, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 107}, {\"loss\": 354.90274820403397, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 108}, {\"loss\": 353.49456874843804, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 109}, {\"loss\": 352.11340831039826, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 110}, {\"loss\": 350.758688341413, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 111}, {\"loss\": 349.4297621001846, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 112}, {\"loss\": 348.12613032841534, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 113}, {\"loss\": 346.8472122421498, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 114}, {\"loss\": 345.5924151561022, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 115}, {\"loss\": 344.3612484546453, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 116}, {\"loss\": 343.153179467925, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 117}, {\"loss\": 341.9677492751778, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 118}, {\"loss\": 340.8043888962497, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 119}, {\"loss\": 339.6626851668394, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 120}, {\"loss\": 338.54216007913976, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 121}, {\"loss\": 337.44230071723916, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 122}, {\"loss\": 336.3627013386345, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 123}, {\"loss\": 335.3029147771128, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 124}, {\"loss\": 334.2625373515403, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 125}, {\"loss\": 333.2411580429622, \"nr_hash_vector\": 1000, \"n_hash\": 
\"2\", \"epoch\": 126}, {\"loss\": 332.23833979526586, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 127}, {\"loss\": 331.25370725553137, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 128}, {\"loss\": 330.28691691072555, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 129}, {\"loss\": 329.3375740312559, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 130}, {\"loss\": 328.4052624047007, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 131}, {\"loss\": 327.48970474716185, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 132}, {\"loss\": 326.5904797570045, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 133}, {\"loss\": 325.7072260329763, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 134}, {\"loss\": 324.8396590161787, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 135}, {\"loss\": 323.9875179998549, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 136}, {\"loss\": 323.1504011489792, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 137}, {\"loss\": 322.3280113753799, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 138}, {\"loss\": 321.5200146907065, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 139}, {\"loss\": 320.72614183883974, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 140}, {\"loss\": 319.9461211298803, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 141}, {\"loss\": 319.1796515974871, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 142}, {\"loss\": 318.4264689224111, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 143}, {\"loss\": 317.68631289785856, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 144}, {\"loss\": 316.95894231159895, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 145}, {\"loss\": 316.244033443146, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 146}, {\"loss\": 315.54137037414534, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 147}, {\"loss\": 314.85070498700173, 
\"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 148}, {\"loss\": 314.17178406528035, \"nr_hash_vector\": 1000, \"n_hash\": \"2\", \"epoch\": 149}, {\"loss\": 1329.2289578982193, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 0}, {\"loss\": 1292.1076176318372, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 1}, {\"loss\": 1256.4681332862815, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 2}, {\"loss\": 1222.2465008143079, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 3}, {\"loss\": 1189.3797877067782, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 4}, {\"loss\": 1157.8090784105366, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 5}, {\"loss\": 1127.478559545936, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 6}, {\"loss\": 1098.3344345015091, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 7}, {\"loss\": 1070.3255482660595, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 8}, {\"loss\": 1043.402744081533, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 9}, {\"loss\": 1017.5198724364101, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 10}, {\"loss\": 992.6327440702701, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 11}, {\"loss\": 968.6981618252028, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 12}, {\"loss\": 945.6764583632215, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 13}, {\"loss\": 923.5289842433539, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 14}, {\"loss\": 902.218937746806, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 15}, {\"loss\": 881.710624118729, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 16}, {\"loss\": 861.9712209196945, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 17}, {\"loss\": 842.9687931334297, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 18}, {\"loss\": 824.6722864179819, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 19}, {\"loss\": 
807.0528313511619, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 20}, {\"loss\": 790.082234185243, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 21}, {\"loss\": 773.7336868252821, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 22}, {\"loss\": 757.9818534506509, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 23}, {\"loss\": 742.8028474063465, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 24}, {\"loss\": 728.1727649479838, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 25}, {\"loss\": 714.0696900524574, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 26}, {\"loss\": 700.4726872262772, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 27}, {\"loss\": 687.3610224951949, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 28}, {\"loss\": 674.7152717538272, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 29}, {\"loss\": 662.517121778477, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 30}, {\"loss\": 650.7485031921761, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 31}, {\"loss\": 639.3921099008113, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 32}, {\"loss\": 628.4321358236261, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 33}, {\"loss\": 617.8530702912719, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 34}, {\"loss\": 607.6396877132464, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 35}, {\"loss\": 597.7777117444709, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 36}, {\"loss\": 588.2536231052643, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 37}, {\"loss\": 579.0545904144406, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 38}, {\"loss\": 570.1676632770653, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 39}, {\"loss\": 561.5809265025551, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 40}, {\"loss\": 553.2829005445907, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 41}, {\"loss\": 
545.262505776503, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 42}, {\"loss\": 537.5092105212002, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 43}, {\"loss\": 530.0128226614049, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 44}, {\"loss\": 522.7640272087833, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 45}, {\"loss\": 515.7534115080948, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 46}, {\"loss\": 508.971862084269, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 47}, {\"loss\": 502.4108796365528, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 48}, {\"loss\": 496.06244376817244, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 49}, {\"loss\": 489.9185239337196, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 50}, {\"loss\": 483.97175630825745, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 51}, {\"loss\": 478.2146375293066, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 52}, {\"loss\": 472.6404718372188, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 53}, {\"loss\": 467.2425625500013, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 54}, {\"loss\": 462.0143636488076, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 55}, {\"loss\": 456.9499299120558, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 56}, {\"loss\": 452.0432922204611, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 57}, {\"loss\": 447.28876380832367, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 58}, {\"loss\": 442.6809649338206, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 59}, {\"loss\": 438.21461139496165, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 60}, {\"loss\": 433.8848021368534, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 61}, {\"loss\": 429.68690815929864, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 62}, {\"loss\": 425.61621420574323, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 63}, 
{\"loss\": 421.66813440605205, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 64}, {\"loss\": 417.8383818480746, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 65}, {\"loss\": 414.1228095450864, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 66}, {\"loss\": 410.51757305820206, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 67}, {\"loss\": 407.01900181831934, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 68}, {\"loss\": 403.62331323939145, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 69}, {\"loss\": 400.32704768050627, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 70}, {\"loss\": 397.1267741010934, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 71}, {\"loss\": 394.01927214244773, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 72}, {\"loss\": 391.0013770267514, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 73}, {\"loss\": 388.07026943059196, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 74}, {\"loss\": 385.2228176456061, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 75}, {\"loss\": 382.4564739264325, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 76}, {\"loss\": 379.76832913848625, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 77}, {\"loss\": 377.155928109414, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 78}, {\"loss\": 374.6169282085191, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 79}, {\"loss\": 372.1486155889483, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 80}, {\"loss\": 369.74895768699616, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 81}, {\"loss\": 367.4156190262137, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 82}, {\"loss\": 365.1463699737997, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 83}, {\"loss\": 362.9392479237883, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 84}, {\"loss\": 360.79231526060244, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", 
\"epoch\": 85}, {\"loss\": 358.7036999561098, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 86}, {\"loss\": 356.67140576052157, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 87}, {\"loss\": 354.6937288651228, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 88}, {\"loss\": 352.76903873971844, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 89}, {\"loss\": 350.8955770012807, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 90}, {\"loss\": 349.07166761524525, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 91}, {\"loss\": 347.29583940281503, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 92}, {\"loss\": 345.56665494816104, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 93}, {\"loss\": 343.8827486089105, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 94}, {\"loss\": 342.24247582140606, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 95}, {\"loss\": 340.6447295597916, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 96}, {\"loss\": 339.0882308827238, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 97}, {\"loss\": 337.57167977357005, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 98}, {\"loss\": 336.093796910567, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 99}, {\"loss\": 334.6535201867393, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 100}, {\"loss\": 333.2497684342612, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 101}, {\"loss\": 331.8813318999642, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 102}, {\"loss\": 330.547230410861, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 103}, {\"loss\": 329.2464625227024, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 104}, {\"loss\": 327.978042596724, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 105}, {\"loss\": 326.74096813576966, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 106}, {\"loss\": 325.5343891292682, \"nr_hash_vector\": 1000, 
\"n_hash\": \"3\", \"epoch\": 107}, {\"loss\": 324.35738101031114, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 108}, {\"loss\": 323.2091575875864, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 109}, {\"loss\": 322.0888956995324, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 110}, {\"loss\": 320.99574468508706, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 111}, {\"loss\": 319.9289799541033, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 112}, {\"loss\": 318.8878254855954, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 113}, {\"loss\": 317.8715088272611, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 114}, {\"loss\": 316.8793829898264, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 115}, {\"loss\": 315.91078797896944, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 116}, {\"loss\": 314.9650610668542, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 117}, {\"loss\": 314.04158023897986, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 118}, {\"loss\": 313.13969120167224, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 119}, {\"loss\": 312.2588236357879, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 120}, {\"loss\": 311.39839221876366, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 121}, {\"loss\": 310.55784295158503, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 122}, {\"loss\": 309.7366132367438, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 123}, {\"loss\": 308.9342031544164, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 124}, {\"loss\": 308.150123980626, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 125}, {\"loss\": 307.383852189682, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 126}, {\"loss\": 306.63498215261245, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 127}, {\"loss\": 305.9029278550399, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 128}, {\"loss\": 
305.18743296393654, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 129}, {\"loss\": 304.48789317964275, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 130}, {\"loss\": 303.8039699508138, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 131}, {\"loss\": 303.13521516188587, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 132}, {\"loss\": 302.4812679330661, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 133}, {\"loss\": 301.8416717377276, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 134}, {\"loss\": 301.21611348521776, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 135}, {\"loss\": 300.6043069321269, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 136}, {\"loss\": 300.00578773235674, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 137}, {\"loss\": 299.4202438972696, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 138}, {\"loss\": 298.8473588239938, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 139}, {\"loss\": 298.28673320132924, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 140}, {\"loss\": 297.7381466009439, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 141}, {\"loss\": 297.20128553227494, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 142}, {\"loss\": 296.67584979164883, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 143}, {\"loss\": 296.1615438868589, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 144}, {\"loss\": 295.6581631142585, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 145}, {\"loss\": 295.165346603466, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 146}, {\"loss\": 294.6829010915707, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 147}, {\"loss\": 294.21050999248376, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 148}, {\"loss\": 293.7479524767576, \"nr_hash_vector\": 1000, \"n_hash\": \"3\", \"epoch\": 149}, {\"loss\": 1656.2483845398106, \"nr_hash_vector\": 1000, \"n_hash\": 
\"4\", \"epoch\": 0}, {\"loss\": 1591.896211475941, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 1}, {\"loss\": 1530.9580055007075, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 2}, {\"loss\": 1473.234378420767, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 3}, {\"loss\": 1418.5357805280885, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 4}, {\"loss\": 1366.6849036823621, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 5}, {\"loss\": 1317.5193947709909, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 6}, {\"loss\": 1270.8837511485249, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 7}, {\"loss\": 1226.6336856811017, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 8}, {\"loss\": 1184.633399296606, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 9}, {\"loss\": 1144.7563264162634, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 10}, {\"loss\": 1106.8822994848556, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 11}, {\"loss\": 1070.8995676390564, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 12}, {\"loss\": 1036.7040074728511, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 13}, {\"loss\": 1004.1960799882318, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 14}, {\"loss\": 973.2830073585309, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 15}, {\"loss\": 943.8769485475098, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 16}, {\"loss\": 915.8969076008162, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 17}, {\"loss\": 889.2660529335742, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 18}, {\"loss\": 863.9113810418363, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 19}, {\"loss\": 839.7642720815708, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 20}, {\"loss\": 816.760161431749, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 21}, {\"loss\": 794.8377464116838, \"nr_hash_vector\": 1000, 
\"n_hash\": \"4\", \"epoch\": 22}, {\"loss\": 773.9406346533997, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 23}, {\"loss\": 754.0157134604369, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 24}, {\"loss\": 735.0112091332752, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 25}, {\"loss\": 716.8790863171441, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 26}, {\"loss\": 699.5751245077563, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 27}, {\"loss\": 683.0560146995177, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 28}, {\"loss\": 667.2814075236067, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 29}, {\"loss\": 652.214055648895, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 30}, {\"loss\": 637.8177962597483, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 31}, {\"loss\": 624.0584406551274, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 32}, {\"loss\": 610.9042197877847, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 33}, {\"loss\": 598.3250163632462, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 34}, {\"loss\": 586.292111667159, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 35}, {\"loss\": 574.7785864529982, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 36}, {\"loss\": 563.7588121743642, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 37}, {\"loss\": 553.2091993441362, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 38}, {\"loss\": 543.1064588481878, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 39}, {\"loss\": 533.4287053036071, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 40}, {\"loss\": 524.1555807938972, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 41}, {\"loss\": 515.2677131647803, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 42}, {\"loss\": 506.7468701189076, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 43}, {\"loss\": 498.57552193715895, \"nr_hash_vector\": 
1000, \"n_hash\": \"4\", \"epoch\": 44}, {\"loss\": 490.7373987484736, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 45}, {\"loss\": 483.21696396929036, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 46}, {\"loss\": 475.99914820332674, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 47}, {\"loss\": 469.06990571868977, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 48}, {\"loss\": 462.4164205449709, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 49}, {\"loss\": 456.0255204076452, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 50}, {\"loss\": 449.8854904327553, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 51}, {\"loss\": 443.9847581657742, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 52}, {\"loss\": 438.3127615805882, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 53}, {\"loss\": 432.8590525471984, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 54}, {\"loss\": 427.6137846250797, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 55}, {\"loss\": 422.5680886474734, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 56}, {\"loss\": 417.7129121589751, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 57}, {\"loss\": 413.03992338941316, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 58}, {\"loss\": 408.54136150608593, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 59}, {\"loss\": 404.20947708778573, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 60}, {\"loss\": 400.0371087939617, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 61}, {\"loss\": 396.0177501266651, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 62}, {\"loss\": 392.1445648199397, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 63}, {\"loss\": 388.4113276911157, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 64}, {\"loss\": 384.81231833395606, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 65}, {\"loss\": 381.34166332685135, 
\"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 66}, {\"loss\": 377.99413239330033, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 67}, {\"loss\": 374.76487332232085, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 68}, {\"loss\": 371.64887109955146, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 69}, {\"loss\": 368.64120109329923, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 70}, {\"loss\": 365.7377878416871, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 71}, {\"loss\": 362.93425992443395, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 72}, {\"loss\": 360.2266565647854, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 73}, {\"loss\": 357.6111754255737, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 74}, {\"loss\": 355.0840615129652, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 75}, {\"loss\": 352.6420513428893, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 76}, {\"loss\": 350.2813737829794, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 77}, {\"loss\": 347.99912358729347, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 78}, {\"loss\": 345.79232965009186, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 79}, {\"loss\": 343.6580034352358, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 80}, {\"loss\": 341.5934006224567, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 81}, {\"loss\": 339.59572107687205, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 82}, {\"loss\": 337.66238930445536, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 83}, {\"loss\": 335.79112941124265, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 84}, {\"loss\": 333.9795971427219, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 85}, {\"loss\": 332.2255288879131, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 86}, {\"loss\": 330.526713092642, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 87}, {\"loss\": 
328.8811788862902, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 88}, {\"loss\": 327.28718291246435, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 89}, {\"loss\": 325.7425109114798, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 90}, {\"loss\": 324.2453830885704, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 91}, {\"loss\": 322.7942895598716, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 92}, {\"loss\": 321.3874844232805, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 93}, {\"loss\": 320.0233609802206, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 94}, {\"loss\": 318.7003220573422, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 95}, {\"loss\": 317.41720026912265, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 96}, {\"loss\": 316.1724449958547, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 97}, {\"loss\": 314.9646674000122, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 98}, {\"loss\": 313.7924963961101, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 99}, {\"loss\": 312.65487646176314, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 100}, {\"loss\": 311.5506205975089, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 101}, {\"loss\": 310.4784509042621, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 102}, {\"loss\": 309.43744539214526, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 103}, {\"loss\": 308.42637726891303, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 104}, {\"loss\": 307.44439694068495, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 105}, {\"loss\": 306.49034832050415, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 106}, {\"loss\": 305.5634779612918, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 107}, {\"loss\": 304.6627884404285, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 108}, {\"loss\": 303.78743236051315, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", 
\"epoch\": 109}, {\"loss\": 302.9366717002721, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 110}, {\"loss\": 302.1095760443421, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 111}, {\"loss\": 301.3054626375134, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 112}, {\"loss\": 300.5235232812974, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 113}, {\"loss\": 299.7630914138595, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 114}, {\"loss\": 299.0234166094896, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 115}, {\"loss\": 298.3040013550107, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 116}, {\"loss\": 297.6041297497102, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 117}, {\"loss\": 296.92309830050726, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 118}, {\"loss\": 296.2603597186119, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 119}, {\"loss\": 295.61532032797055, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 120}, {\"loss\": 294.9874944435697, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 121}, {\"loss\": 294.37631229339587, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 122}, {\"loss\": 293.7812555978418, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 123}, {\"loss\": 293.20186701553985, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 124}, {\"loss\": 292.6376491940367, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 125}, {\"loss\": 292.0880914157865, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 126}, {\"loss\": 291.55291783692013, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 127}, {\"loss\": 291.0314467355339, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 128}, {\"loss\": 290.523656524876, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 129}, {\"loss\": 290.02878097145634, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 130}, {\"loss\": 289.54655214485103, 
\"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 131}, {\"loss\": 289.0766117281336, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 132}, {\"loss\": 288.6186043198089, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 133}, {\"loss\": 288.17215007849745, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 134}, {\"loss\": 287.73691311159797, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 135}, {\"loss\": 287.31263225873283, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 136}, {\"loss\": 286.899067612508, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 137}, {\"loss\": 286.4956205123104, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 138}, {\"loss\": 286.10227345649463, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 139}, {\"loss\": 285.7185113001267, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 140}, {\"loss\": 285.3443198455355, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 141}, {\"loss\": 284.97920780532047, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 142}, {\"loss\": 284.62301032386347, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 143}, {\"loss\": 284.27548648291764, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 144}, {\"loss\": 283.9364376585152, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 145}, {\"loss\": 283.6056103118706, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 146}, {\"loss\": 283.2827273751423, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 147}, {\"loss\": 282.9675593856814, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 148}, {\"loss\": 282.659977989786, \"nr_hash_vector\": 1000, \"n_hash\": \"4\", \"epoch\": 149}, {\"loss\": 1983.529537588902, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 0}, {\"loss\": 1884.3690371575383, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 1}, {\"loss\": 1791.800678380372, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 2}, 
{\"loss\": 1705.3306850442002, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 3}, {\"loss\": 1624.4998646596107, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 4}, {\"loss\": 1548.8930772749882, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 5}, {\"loss\": 1478.134444833768, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 6}, {\"loss\": 1411.8761153315327, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 7}, {\"loss\": 1349.7967984818304, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 8}, {\"loss\": 1291.6014924606495, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 9}, {\"loss\": 1237.0210349386582, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 10}, {\"loss\": 1185.8043974843315, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 11}, {\"loss\": 1137.7189583672614, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 12}, {\"loss\": 1092.5528325464488, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 13}, {\"loss\": 1050.1085726927945, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 14}, {\"loss\": 1010.2021225543253, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 15}, {\"loss\": 972.6632671354367, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 16}, {\"loss\": 937.336342514549, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 17}, {\"loss\": 904.0769842001029, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 18}, {\"loss\": 872.7485389484309, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 19}, {\"loss\": 843.2247466027985, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 20}, {\"loss\": 815.389348781373, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 21}, {\"loss\": 789.1324401530353, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 22}, {\"loss\": 764.3548509508643, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 23}, {\"loss\": 740.9635695573196, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 
24}, {\"loss\": 718.8695409045397, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 25}, {\"loss\": 697.9914098447798, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 26}, {\"loss\": 678.2543205474262, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 27}, {\"loss\": 659.5867088773339, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 28}, {\"loss\": 641.9231967733988, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 29}, {\"loss\": 625.2032214577131, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 30}, {\"loss\": 609.3688072641296, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 31}, {\"loss\": 594.366473907617, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 32}, {\"loss\": 580.1463774468156, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 33}, {\"loss\": 566.6623905388254, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 34}, {\"loss\": 553.870331057408, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 35}, {\"loss\": 541.7297148257978, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 36}, {\"loss\": 530.2032519240412, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 37}, {\"loss\": 519.254872357097, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 38}, {\"loss\": 508.85142752281973, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 39}, {\"loss\": 498.9613046233911, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 40}, {\"loss\": 489.5554672743659, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 41}, {\"loss\": 480.60653703637064, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 42}, {\"loss\": 472.08908182845767, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 43}, {\"loss\": 463.9787462644235, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 44}, {\"loss\": 456.25330818057927, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 45}, {\"loss\": 448.8923101301766, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", 
\"epoch\": 46}, {\"loss\": 441.87495108028025, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 47}, {\"loss\": 435.1831991186188, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 48}, {\"loss\": 428.79935813066373, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 49}, {\"loss\": 422.7066377378182, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 50}, {\"loss\": 416.8900039311702, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 51}, {\"loss\": 411.33503969882867, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 52}, {\"loss\": 406.02769744001534, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 53}, {\"loss\": 400.9549873893364, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 54}, {\"loss\": 396.10525281310447, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 55}, {\"loss\": 391.4672448485217, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 56}, {\"loss\": 387.029833854223, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 57}, {\"loss\": 382.78295139686, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 58}, {\"loss\": 378.7172185538349, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 59}, {\"loss\": 374.8233889797589, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 60}, {\"loss\": 371.09297961725184, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 61}, {\"loss\": 367.51846163324547, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 62}, {\"loss\": 364.0919022783646, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 63}, {\"loss\": 360.8061260862567, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 64}, {\"loss\": 357.65481541864347, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 65}, {\"loss\": 354.63070675304687, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 66}, {\"loss\": 351.72825965794686, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 67}, {\"loss\": 348.94203612290687, \"nr_hash_vector\": 1000, 
\"n_hash\": \"5\", \"epoch\": 68}, {\"loss\": 346.26632831774316, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 69}, {\"loss\": 343.69570665270146, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 70}, {\"loss\": 341.2259674070469, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 71}, {\"loss\": 338.852039366339, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 72}, {\"loss\": 336.5697795818144, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 73}, {\"loss\": 334.37505278962027, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 74}, {\"loss\": 332.26376140976856, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 75}, {\"loss\": 330.232423007496, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 76}, {\"loss\": 328.2774142305023, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 77}, {\"loss\": 326.39526821481445, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 78}, {\"loss\": 324.5831177032742, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 79}, {\"loss\": 322.8377475807972, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 80}, {\"loss\": 321.15642492843, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 81}, {\"loss\": 319.536171856347, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 82}, {\"loss\": 317.9742933284252, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 83}, {\"loss\": 316.4687884592335, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 84}, {\"loss\": 315.01698699182896, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 85}, {\"loss\": 313.61687098591386, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 86}, {\"loss\": 312.26596234366986, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 87}, {\"loss\": 310.9624668295123, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 88}, {\"loss\": 309.7047080257712, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 89}, {\"loss\": 308.49039331721787, \"nr_hash_vector\": 
1000, \"n_hash\": \"5\", \"epoch\": 90}, {\"loss\": 307.3178814617213, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 91}, {\"loss\": 306.1856006388344, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 92}, {\"loss\": 305.09192177354186, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 93}, {\"loss\": 304.0353424437649, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 94}, {\"loss\": 303.0141986515573, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 95}, {\"loss\": 302.02754219777046, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 96}, {\"loss\": 301.07390640906766, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 97}, {\"loss\": 300.15170844467144, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 98}, {\"loss\": 299.2599095521291, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 99}, {\"loss\": 298.39739589664225, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 100}, {\"loss\": 297.56309348376993, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 101}, {\"loss\": 296.7559316095702, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 102}, {\"loss\": 295.9749462228038, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 103}, {\"loss\": 295.21894341085346, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 104}, {\"loss\": 294.4871982513456, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 105}, {\"loss\": 293.7786118035215, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 106}, {\"loss\": 293.09257728152807, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 107}, {\"loss\": 292.42820373011995, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 108}, {\"loss\": 291.78456858778225, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 109}, {\"loss\": 291.1610193448231, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 110}, {\"loss\": 290.5568461570194, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 111}, {\"loss\": 
289.9714812800654, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 112}, {\"loss\": 289.4040968135294, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 113}, {\"loss\": 288.8541097859893, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 114}, {\"loss\": 288.3208309239401, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 115}, {\"loss\": 287.8038742652736, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 116}, {\"loss\": 287.30249457584705, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 117}, {\"loss\": 286.8162301438075, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 118}, {\"loss\": 286.3446534431262, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 119}, {\"loss\": 285.88694770543566, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 120}, {\"loss\": 285.44293716548924, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 121}, {\"loss\": 285.01203086726724, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 122}, {\"loss\": 284.5937803652101, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 123}, {\"loss\": 284.18784557786154, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 124}, {\"loss\": 283.7938567626075, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 125}, {\"loss\": 283.4112702643356, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 126}, {\"loss\": 283.03987335505605, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 127}, {\"loss\": 282.6790431823389, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 128}, {\"loss\": 282.32886076938456, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 129}, {\"loss\": 281.9886591180783, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 130}, {\"loss\": 281.65805439615644, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 131}, {\"loss\": 281.33700312642054, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 132}, {\"loss\": 281.02495860138725, \"nr_hash_vector\": 1000, \"n_hash\": 
\"5\", \"epoch\": 133}, {\"loss\": 280.721795787517, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 134}, {\"loss\": 280.4271423103239, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 135}, {\"loss\": 280.1407403222957, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 136}, {\"loss\": 279.86241159894655, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 137}, {\"loss\": 279.59180286487145, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 138}, {\"loss\": 279.32870660379507, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 139}, {\"loss\": 279.0727878731866, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 140}, {\"loss\": 278.82395199693843, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 141}, {\"loss\": 278.58200565142505, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 142}, {\"loss\": 278.3466287717034, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 143}, {\"loss\": 278.1176239709726, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 144}, {\"loss\": 277.89497252209065, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 145}, {\"loss\": 277.67835372196413, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 146}, {\"loss\": 277.46754986825033, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 147}, {\"loss\": 277.2625041975386, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 148}, {\"loss\": 277.06277028411233, \"nr_hash_vector\": 1000, \"n_hash\": \"5\", \"epoch\": 149}]}}, {\"mode\": \"vega-lite\"});\n",
       "</script>"
      ],
      "text/plain": [
       "alt.Chart(...)"
      ]
     },
     "execution_count": 15,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "import pandas as pd\n",
    "import altair as alt\n",
    "\n",
    "# Collect the recorded loss curves into a dataframe for plotting.\n",
    "source = pd.DataFrame(data)\n",
    "\n",
    "# One loss curve per number of hash functions, coloured by n_hash.\n",
    "(alt.Chart(source)\n",
    "  .mark_line()\n",
    "  .encode(x='epoch', y='loss', color='n_hash')\n",
    "  .properties(width=600, height=250)\n",
    "  .interactive())"
   ]
  }
 ],
 "metadata": {
  "colab": {
   "authorship_tag": "ABX9TyPAXtr/TeMWYmJkxrXcAPIT",
   "collapsed_sections": [],
   "name": "bloom_embeddings.ipynb",
   "version": ""
  },
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
