{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "f92-4Hjy7kA8"
   },
   "source": [
    "<a href=\"https://www.arduino.cc/\"><img src=\"https://raw.githubusercontent.com/sandeepmistry/aimldevfest-workshop-2019/master/images/Arduino_logo_R_highquality.png\" width=200/></a>\n",
    "# Tiny ML on Arduino\n",
    "## Classify objects by color tutorial\n",
    "\n",
    " \n",
    "https://github.com/arduino/ArduinoTensorFlowLiteTutorials/"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "uvDA8AK7QOq-"
   },
   "source": [
    "## Setup Python Environment \n",
    "\n",
    "The next cell sets up the dependencies in required for the notebook, run it."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "id": "Y2gs-PL4xDkZ"
   },
   "outputs": [],
   "source": [
    "# Setup environment\n",
    "#!apt-get -qq install xxd\n",
    "#!pip install pandas numpy matplotlib\n",
    "#%tensorflow_version 2.x\n",
    "#!pip install tensorflow"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "9lwkeshJk7dg"
   },
   "source": [
    "# Upload Data\n",
    "\n",
    "1. Open the panel on the left side of Colab by clicking on the __>__\n",
    "1. Select the Files tab\n",
    "1. Drag `csv` files from your computer to the tab to upload them into colab."
   ]
  },
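  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To confirm the upload worked, you can list the CSV files Colab received (a quick check, assuming the uploaded files land in `/content/`):\n",
    "\n",
    "```python\n",
    "import os\n",
    "print([f for f in os.listdir(\"/content/\") if f.endswith(\".csv\")])\n",
    "```"
   ]
  },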
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "kSxUeYPNQbOg"
   },
   "source": [
    "# Train Neural Network\n",
    "\n",
    "\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "Gxk414PU3oy3"
   },
   "source": [
    "## Parse and prepare the data\n",
    "\n",
    "The next cell parses the csv files and transforms them to a format that will be used to train the full connected neural network.\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "id": "AGChd1FAk5_j"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "TensorFlow version = 2.3.1\n",
      "\n",
      "\u001b[32;4mred\u001b[0m class will be output \u001b[32m0\u001b[0m of the classifier\n",
      "148 samples captured for training with inputs ['Red', 'Green', 'Blue'] \n",
      "\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlMAAABZCAYAAAAaRaGnAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAAMJ0lEQVR4nO3dbawcZ3XA8f+J35IYlTgvTR3b4prGapQiWhKLBrWqEPQlSVFcqa0UhARpqfyFCFohVUkjUZVvqFVpkWiQFdIEhBLUlLYuCqVpQOJLk8amYBKH1Jfg1teYxHkhDXmx4/j0w0zwc5+7u3dzZ333evf/k67uzDyzs8+cOTN7dmdnJzITSZIkLc1Z4+6AJEnSmcxiSpIkqQOLKUmSpA4spiRJkjqwmJIkSerAYkqSJKmDRYupiLg9Ip6MiIf7tEdEfCoiZiNiX0RcMfpuSpIkrUzDfDJ1B3D1gPZrgG3t307g1u7dkiRJOjMsWkxl5jeAZwbMsgP4XDYeAM6LiI2j6qAkSdJKtnoEy9gEHCrG59ppR+oZI2InzadXrF+//srLLrtsBE9feGGJjzvxytIed3LNgMbj1fja4dqOV22r1/J6Ha9K5HlLqMvnYTNgUIzWVG2vFgs9Wf3C/smTxTKrtldeLYar+n1N0bZuwEqcVbS9dE4134Zi+S9Wyz+X0+nYq/3b1q1b4kLrx50oc2dtz8GF81XOOtG/rXzY6ipxTjKcY9X4vPyuG/s3HVuw8oU+TbGqmlCkytqzqhwu87beRwaEb976DFidBY1rTnX62IBdbcE6FNbW23neblJ1et627D/b/PUZsEJrqqC/1H9W6t2ydFaf4Xo8qiAdK7dXFYjymFq3HeszUq/PAOX2qh9VLr7edvNeEepjdtkY1UKzWJ+TS1vXekuuK17Khl2f+rh1vDh0DM7FSnnsWJB/Ax43KFdq6xdpH9LevXufysyLerWNopgaWmbuAnYBbN++Pffs2TPaJ/iPJT7uqR8s7XEvXzKgca4a3zxc21zVtmEzr9dclTjzllAfyC4YcqGDYnRJVTc/Xyz0peqg9+Nib3n25fltR54/NfyDu6rnePbU8JuqwmdVkdtnv+HU8P7Lq47+XrH8vdXyr+R0mn26f9ul25a40Eur8ScOFiMzPQcBOFrnX+GcJ/q3lSmw4eL5bS8MKMJK/1ONnz9zajgP9H/c7PxXlNkFK18spk/Tmp+aP7662E82n13l98vFEf6iat0O9n1quHjm1PDsgPnqdd14KgkOLHgbekq9DqWZmWpC+WIXh+a3HSriWa7fwWoZ5focGLB9NlZJvL//rNS7ZanYfRccq8r1WVttr8fL7VUdMw8ePDVcrg/M30blNqnXZ4Bye22rCp8DRY1Xb7uyJwuO2eUq1IXJK8W2fHnL/LYh1/VA9T52W3Euadj12VaF6GBx6BiYi7WyQjtYtVWHmXnK/Fis9n3HIu1Dioj6CPYTo7ia7zBQbtHN7TRJkqSJN4piajfw/vaqvquA5zJzwHsrSZKkybHoab6IuAt4J3BhRMwBfwasAcjMzwD3AtfSfIj4IvD7p6uzkiRJK82ixVRmvneR9gQ+NLIeSZIknUH8BXRJkqQOLKYkSZI6sJiSJEnqwGJKkiSpA4spSZKkDiymJEmSOrCYkiRJ6sBiSpIkqQOLKUmSpA4spiRJkjqwmJIkSerAYkqSJKkDiylJkqQOLKYkSZI6sJiSJEnqwGJKkiSpA4spSZKkDiymJEmSOhiqmIqIqyPisYiYjYiberTfEBFHI+Jb7d8fjr6rkiRJK8/qxWaIiFXAp4FfB+aAhyJid2bur2b9YmbeeBr6KEmStGIN88nU24HZzHw8M48DdwM7Tm+3JEmSzgzDFFObgEPF+Fw7rfY7EbEvIu6JiC0j6Z0kSdIKN6ovoP8LMJOZbwXuA+7sNVNE7IyIPRGx5+jRoyN6akmSpPEZppg6DJSfNG1up/1EZj6d
mcfa0duAK3stKDN3Zeb2zNx+0UUXLaW/kiRJK8owxdRDwLaI2BoRa4Hrgd3lDBGxsRi9Dnh0dF2UJElauRa9mi8zT0TEjcBXgVXA7Zn5SER8HNiTmbuBD0fEdcAJ4BnghtPYZ0mSpBVj0WIKIDPvBe6tpn2sGL4ZuHm0XZMkSVr5/AV0SZKkDiymJEmSOrCYkiRJ6sBiSpIkqQOLKUmSpA4spiRJkjqwmJIkSerAYkqSJKkDiylJkqQOLKYkSZI6sJiSJEnqwGJKkiSpA4spSZKkDiymJEmSOrCYkiRJ6sBiSpIkqQOLKUmSpA4spiRJkjqwmJIkSepgqGIqIq6OiMciYjYiburRvi4ivti2PxgRMyPvqSRJ0gq0aDEVEauATwPXAJcD742Iy6vZPgg8m5mXAp8EPjHqjkqSJK1Ew3wy9XZgNjMfz8zjwN3AjmqeHcCd7fA9wLsjIkbXTUmSpJVp9RDzbAIOFeNzwC/1myczT0TEc8AFwFPlTBGxE9jZjv44Ih5bSqdfhwvrPggwLr0Yk96My0LGpDfjspAx6e1Mjcub+jUMU0yNTGbuAnYt1/NFxJ7M3L5cz3emMC4LGZPejMtCxqQ347KQMeltEuMyzGm+w8CWYnxzO63nPBGxGngj8PQoOihJkrSSDVNMPQRsi4itEbEWuB7YXc2zG/hAO/y7wNcyM0fXTUmSpJVp0dN87XegbgS+CqwCbs/MRyLi48CezNwNfBb4fETMAs/QFFwrwbKdUjzDGJeFjElvxmUhY9KbcVnImPQ2cXEJP0CSJElaOn8BXZIkqQOLKUmSpA4mtpha7BY40yAitkTE1yNif0Q8EhEfaaefHxH3RcSB9v+Gcfd1HCJiVUT8V0R8uR3f2t4Oaba9PdLacfdxOUXEeRFxT0R8NyIejYh3mCsQEX/c7j8PR8RdEXH2NOZKRNweEU9GxMPFtJ75EY1PtfHZFxFXjK/np0+fmPxFuw/ti4h/jIjzirab25g8FhG/OZZOL4NecSnaPhoRGREXtuMTkSsTWUwNeQucaXAC+GhmXg5cBXyojcNNwP2ZuQ24vx2fRh8BHi3GPwF8sr0t0rM0t0maJn8D/GtmXgb8Ak1spjpXImIT8GFge2a+heYinOuZzly5A7i6mtYvP64BtrV/O4Fbl6mPy+0OFsbkPuAtmflW4L+BmwHaY+/1wM+3j/nb9rVqEt3BwrgQEVuA3wD+t5g8EbkykcUUw90CZ+Jl5pHM/GY7/DzNi+Mm5t/+507gt8fSwTGKiM3AbwG3teMBvIvmdkgwZXGJiDcCv0pzZS6ZeTwzf4S5As1Vz+e0v6F3LnCEKcyVzPwGzdXapX75sQP4XDYeAM6LiI3L0tFl1CsmmflvmXmiHX2A5rcZoYnJ3Zl5LDO/D8zSvFZNnD65As29e/8EKK98m4hcmdRiqtctcDaNqS8rQkTMAG8DHgQuzswjbdMPgYvH1a8x+muanfpkO34B8KPiIDhtObMVOAr8XXvq87aIWM+U50pmHgb+kuad9BHgOWAv050rpX754TG48QfAV9rhqY5JROwADmfmt6umiYjLpBZTKkTEG4B/AP4oM/+vbGt/XHWqfh8jIt4DPJmZe8fdlxVkNXAFcGtmvg14geqU3pTmygaad85bgUuA9fQ4faHpzI9BIuIWmq9afGHcfRm3iDgX+FPgY+Puy+kyqcXUMLfAmQoRsYamkPpCZn6pnfzEax+jtv+fHFf/xuSXgesi4iDNKeB30Xxf6Lz2VA5MX87MAXOZ+WA7fg9NcTXtufJrwPcz82hmvgJ8iSZ/pjlXSv3yY6qPwRFxA/Ae4H3F3UCmOSY/S/OG5NvtcXcz8M2I+BkmJC6TWkwNcwucidd+D+izwKOZ+VdFU3n7nw8A/7zcfRunzLw5Mzdn5gxNbnwtM98HfJ3mdkgwZXHJzB8ChyLi59pJ7wb2M+W5QnN676qIOLfdn16Ly9TmSqVffuwG3t9eqXUV8FxxOnCiRcTV
NF8huC4zXyyadgPXR8S6iNhK84Xr/xxHH5dbZn4nM386M2fa4+4ccEV73JmMXMnMifwDrqW5kuJ7wC3j7s+YYvArNB+77wO+1f5dS/P9oPuBA8C/A+ePu69jjNE7gS+3w2+mObjNAn8PrBt3/5Y5Fr8I7Gnz5Z+ADeZKAvw58F3gYeDzwLppzBXgLprvjb1C82L4wX75AQTNFdXfA75DczXk2NdhmWIyS/MdoNeOuZ8p5r+ljcljwDXj7v9yxqVqPwhcOEm54u1kJEmSOpjU03ySJEnLwmJKkiSpA4spSZKkDiymJEmSOrCYkiRJ6sBiSpIkqQOLKUmSpA7+H0sYkA9rgj+3AAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 720x72 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Data set parsing and preparation complete.\n",
      "Data set randomization and splitting complete.\n"
     ]
    }
   ],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "import tensorflow as tf\n",
    "import os\n",
    "import fileinput\n",
    "\n",
    "print(f\"TensorFlow version = {tf.__version__}\\n\")\n",
    "\n",
    "# Set a fixed random seed value, for reproducibility, this will allow us to get\n",
    "# the same random numbers each time the notebook is run\n",
    "SEED = 1337\n",
    "np.random.seed(SEED)\n",
    "tf.random.set_seed(SEED)\n",
    "\n",
    "CLASSES = [];\n",
    "\n",
    "for file in os.listdir(\"content/\"):\n",
    "    if file.endswith(\".csv\"):\n",
    "        CLASSES.append(os.path.splitext(file)[0])\n",
    "\n",
    "CLASSES.sort()\n",
    "\n",
    "SAMPLES_WINDOW_LEN = 1\n",
    "NUM_CLASSES = len(CLASSES)\n",
    "\n",
    "# create a one-hot encoded matrix that is used in the output\n",
    "ONE_HOT_ENCODED_CLASSES = np.eye(NUM_CLASSES)\n",
    "\n",
    "inputs = []\n",
    "outputs = []\n",
    "\n",
    "# read each csv file and push an input and output\n",
    "for class_index in range(NUM_CLASSES):\n",
    "  objectClass = CLASSES[class_index]\n",
    "  df = pd.read_csv(\"content/\" + objectClass + \".csv\")\n",
    "  columns = list(df)\n",
    "  # get rid of pesky empty value lines of csv which cause NaN inputs to TensorFlow\n",
    "  df = df.dropna()\n",
    "  df = df.reset_index(drop=True)\n",
    "   \n",
    "  # calculate the number of objectClass recordings in the file\n",
    "  num_recordings = int(df.shape[0] / SAMPLES_WINDOW_LEN)\n",
    "  print(f\"\\u001b[32;4m{objectClass}\\u001b[0m class will be output \\u001b[32m{class_index}\\u001b[0m of the classifier\")\n",
    "  print(f\"{num_recordings} samples captured for training with inputs {list(df)} \\n\")\n",
    "\n",
    "  # graphing\n",
    "  plt.rcParams[\"figure.figsize\"] = (10,1)\n",
    "  pixels = np.array([df['Red'],df['Green'],df['Blue']],float)\n",
    "  pixels = np.transpose(pixels)\n",
    "  for i in range(num_recordings):\n",
    "    plt.axvline(x=i, linewidth=8, color=tuple(pixels[i]/np.max(pixels[i], axis=0)))\n",
    "  plt.show()\n",
    "  \n",
    "  #tensors\n",
    "  output = ONE_HOT_ENCODED_CLASSES[class_index]\n",
    "  for i in range(num_recordings):\n",
    "    tensor = []\n",
    "    row = []\n",
    "    for c in columns:\n",
    "      row.append(df[c][i])\n",
    "    tensor += row\n",
    "    inputs.append(tensor)\n",
    "    outputs.append(output)\n",
    "\n",
    "# convert the list to numpy array\n",
    "inputs = np.array(inputs)\n",
    "outputs = np.array(outputs)\n",
    "\n",
    "print(\"Data set parsing and preparation complete.\")\n",
    "\n",
    "# Randomize the order of the inputs, so they can be evenly distributed for training, testing, and validation\n",
    "# https://stackoverflow.com/a/37710486/2020087\n",
    "num_inputs = len(inputs)\n",
    "randomize = np.arange(num_inputs)\n",
    "np.random.shuffle(randomize)\n",
    "\n",
    "# Swap the consecutive indexes (0, 1, 2, etc) with the randomized indexes\n",
    "inputs = inputs[randomize]\n",
    "outputs = outputs[randomize]\n",
    "\n",
    "# Split the recordings (group of samples) into three sets: training, testing and validation\n",
    "TRAIN_SPLIT = int(0.6 * num_inputs)\n",
    "TEST_SPLIT = int(0.2 * num_inputs + TRAIN_SPLIT)\n",
    "\n",
    "inputs_train, inputs_test, inputs_validate = np.split(inputs, [TRAIN_SPLIT, TEST_SPLIT])\n",
    "outputs_train, outputs_test, outputs_validate = np.split(outputs, [TRAIN_SPLIT, TEST_SPLIT])\n",
    "\n",
    "print(\"Data set randomization and splitting complete.\")\n"
   ]
  },
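  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check, the three subsets produced by the 60/20/20 split above should add back up to the full data set (a sketch, using the variables defined in the previous cell):\n",
    "\n",
    "```python\n",
    "assert len(inputs_train) + len(inputs_test) + len(inputs_validate) == num_inputs\n",
    "assert len(inputs_train) == len(outputs_train)\n",
    "print(len(inputs_train), len(inputs_test), len(inputs_validate))\n",
    "```"
   ]
  },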
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "v8qlSAX1b6Yv"
   },
   "source": [
    "## Build & Train the Model\n",
    "\n",
    "Build and train a [TensorFlow](https://www.tensorflow.org) model using the high-level [Keras](https://www.tensorflow.org/guide/keras) API."
   ]
  },
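  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The next cell defines and trains the model. As a rough sketch, a small fully connected classifier over the three RGB inputs looks like this (layer sizes here are illustrative, not necessarily the exact values used below):\n",
    "\n",
    "```python\n",
    "model = tf.keras.Sequential([\n",
    "    tf.keras.layers.Dense(8, activation=\"relu\", input_shape=(3,)),  # RGB inputs\n",
    "    tf.keras.layers.Dense(NUM_CLASSES, activation=\"softmax\")  # one output per class\n",
    "])\n",
    "model.compile(optimizer=\"rmsprop\", loss=\"mse\", metrics=[\"mae\"])  # mae matches the metric shown in the training log\n",
    "```"
   ]
  },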
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "id": "kGNFa-lX24Qo"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Epoch 1/400\n",
      "22/22 [==============================] - 0s 5ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 2/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 3/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 4/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 5/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 6/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 7/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 8/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 9/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 10/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 11/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 12/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 13/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 14/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 15/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 16/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 17/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 18/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 19/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 20/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 21/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 22/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 23/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 24/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 25/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 26/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 27/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 28/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 29/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 30/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 31/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 32/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 33/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 34/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 35/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 36/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 37/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 38/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 39/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 40/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 41/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 42/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 43/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 44/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 45/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 46/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 47/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 48/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 49/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 50/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 51/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 52/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 53/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 54/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 55/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 56/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 57/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 58/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 59/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 60/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 61/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 62/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 63/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 64/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 65/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 66/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 67/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 68/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 69/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 70/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 71/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 72/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 73/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 74/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 75/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 76/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 77/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 78/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 79/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 80/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 81/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 82/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 83/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 84/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 85/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 86/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 87/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 88/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 89/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 90/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 91/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 92/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 93/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 94/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 95/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 96/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 97/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 98/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 99/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 100/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 101/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 102/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 103/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 104/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 105/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 106/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 107/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 108/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 109/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 110/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 111/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 112/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 323/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 324/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 325/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 326/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 327/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 328/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 329/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 330/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 331/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 332/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 333/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 334/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 335/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 336/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 337/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 338/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 339/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 340/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 341/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 342/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 343/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 344/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 345/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 346/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 347/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 348/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 349/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 350/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 351/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 352/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 353/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 354/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 355/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 356/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 357/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 358/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 359/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 360/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 361/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 362/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 363/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 364/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 365/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 366/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 367/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 368/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 369/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 370/400\n",
      "22/22 [==============================] - 0s 2ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 371/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 372/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 373/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 374/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 375/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 376/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 377/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 378/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 379/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 380/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 381/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 382/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 383/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 384/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 385/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 386/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 387/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 388/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 389/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 390/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 391/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 392/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 393/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 394/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 395/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 396/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 397/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 398/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 399/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n",
      "Epoch 400/400\n",
      "22/22 [==============================] - 0s 1ms/step - loss: 0.0000e+00 - mae: 0.0000e+00 - val_loss: 0.0000e+00 - val_mae: 0.0000e+00\n"
     ]
    }
   ],
   "source": [
    "# Build the fully connected model and train it\n",
    "model = tf.keras.Sequential()\n",
    "model.add(tf.keras.layers.Dense(8, activation='relu')) # ReLU is fast and works well for small models\n",
    "model.add(tf.keras.layers.Dense(5, activation='relu'))\n",
    "model.add(tf.keras.layers.Dense(NUM_CLASSES, activation='softmax')) # softmax, because exactly one class is expected per input\n",
    "model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])\n",
    "history = model.fit(inputs_train, outputs_train, epochs=400, batch_size=4, validation_data=(inputs_validate, outputs_validate))\n",
    "\n"
   ]
  },
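  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an optional check that training converged, the per-epoch metrics recorded in `history` can be graphed. A minimal sketch (not run here; it assumes `matplotlib.pyplot` is imported as `plt`, as in the plotting cell below):\n",
    "\n",
    "```python\n",
    "# history.history holds one value per epoch for each metric\n",
    "plt.plot(history.history['loss'], label='training loss')\n",
    "plt.plot(history.history['val_loss'], label='validation loss')\n",
    "plt.legend()\n",
    "plt.show()\n",
    "```"
   ]
  },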
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "guMjtfa42ahM"
   },
   "source": [
    "### Run with Test Data\n",
    "Run the test data through the model and plot the predictions against the actual values.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "id": "V3Y0CCWJz2EK"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "predictions =\n",
      " [[1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]]\n",
      "actual =\n",
      " [[1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]\n",
      " [1.]]\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlkAAABlCAYAAAB+3/a4AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAAVn0lEQVR4nO3deZRcZZnH8e+PLECzZxERQoIIStA5LG2CZ0ZlXBKCSlhmPAMBAXUQhMMcFcRxAYcRweUI48BIorIdRUBhnIyADhLCImSg2QJEGUPCFkBCwpqQ/Zk/7tvh9qWqq6q7bld15/c5p07qbu993+d9m3q47626igjMzMzMrLk2a3UFzMzMzIYiJ1lmZmZmJXCSZWZmZlYCJ1lmZmZmJXCSZWZmZlYCJ1lmZmZmJXCSZVYCSTdKOrbZ+/aXpJD0joE4V9nybZF0saRvDMA5j5N0R9nnaQeSHpf0kRLKHTJj0KyW4a2ugFm7kPRabrEDWA2sT8ufi4if11tWREwrY9+BImkCsBgYERHrWlydmiLixHr2kzQX+FlE/KTcGg28odw2s8HKSZZZEhFbd7+X9Djw2Yj4fXE/ScMHQ+IxmDimZjYUebrQrAZJB0p6WtIZkp4DLpW0g6TfSFoq6cX0fpfcMXMlfTa9P07SHZK+n/ZdLGlaH/fdTdJtkl6V9HtJF0n6WS91P13Ss5KekfTpwraPSbpf0iuSnpL0zdzm29K/L0l6TdL7JO0uaY6kZZJekPRzSdv3cu6QdKqkRWn/70naLNfOP0g6X9Iy4JuSNk/tflLSX9IU4JZ1tuUySd/KLU+X9EBq22OSDpJ0DvB+4MLUpgvTvu+SdJOk5ZIelfTJXDmjJc1O5dwN7N5Le2+UdEph3YOSDlfmfEnPp7IekvTuKuUcL+mPqY8XSfpcYXtdbZM0IfXB8Nyx+bHWUH/mypgs6TlJw3LrDpM0P72fJOkuSS+l/rpQ0sgqZW2sT1ruMR1bo28OlrQgxWmJpNNq1d1soDnJMqvPW4FRwHjgBLK/nUvT8q7A68CFvRw/GXgUGAN8F/ipJPVh3yuBu4HRwDeBY6qdUNJBwGnAR4E9gOL9NSuATwHbAx8DTpJ0aNr2gfTv9hGxdUTcBQg4F3gbsBcwLtWhN4cBncB+wHQgnxxNBhYBOwLnAOcBewL7AO8AdgbOrLMt+XZPAq4ATk9t+wDweER8DbgdOCW16RRJWwE3kcX1LcA/AP8haWIq7iJgFbBTqnuP5K7gF8CRuXpMJBsf1wNTUj32BLYDPgksq1LO88DHgW2B44HzJe3XaNt6qefGKtJ4fxIR/0s2dj6UW30UWQwhm2L/Atn4fR/wYeDzddSnZ+Vq981PyabxtwHeDcxp9BxmZXOSZVafDcBZEbE6Il6PiGURcW1ErIyIV8mShA/2cvwTEfHjiFgPXE72ob1jI/tK2hV4L3BmRKyJiDuA2b2c85PApRHxcESsoPABGhFzI+KhiNgQEfPJkoSqbYiIhRFxU4rBUuAHNdoM8J2IWB4RTwIXkEtCgGci4t/TNOEqsuT1C2n/V4Fvk32w1mxLwWeAS1JdN0TEkoj4U5V9P06WpFwaEesi4n7gWuDv05WaI8jivSIiHibrj2r+E9hH0vi0PAO4LiJWA2uBbYB3AYqIP0bEs5UKiYjrI+KxyNwK/A/ZVapG29arPvZnt40JpaRtgIPTOiLi3oiYl+L5ODCzgXLzqvZN2r4WmChp24h4MSLu68M5zErlJMusPksjYlX3gqQOSTMlPSHpFbLpte3zUygFz3W/iYiV6e3WDe77NmB5bh3AU73U+W2F7U/kN6Zpn1uUTXm+DJxIdvWhIkk7SroqTc28Avyst/0r1O+JVKdK28aSfdng3jTN9BLw27S+ZlsKxgGP1ahXt/HA5O5zpvPOILtyOZbsvtW6zpsSw+t5IzE8Evh52jaH7ErnRcDzkmZJ2rZSOZKmSZqXpsheIktguuPc
SNt61cf+7HYlcLikzYHDgfsi4olU7p7Kps+fS+V+u4Fy83rrG8gS4IOBJyTdKul9fTiHWamcZJnVJwrLXwLeCUyOiG15Y3qt2hRgMzwLjJLUkVs3rsb++e27FrZfSXYlbFxEbAdczBv1L7YXsg/LAN6T2nw0tdtbPP8zueX8OV4gm3LdOyK2T6/tcl9GqNWWvKeofu9UsV1PAbfmztk9PXoSsBRY18B5IV3hSR/4WwC3bDxxxA8jYn9gItm04enFg1PSci3wfWDHiNgeuIE34txI21akf/Pj5a25933pz+62LCBLOKfRc6oQ4EfAn4A9Urlf7aXcFb3Ur7e+ISLuiYjpZFOJvwauqafuZgPJSZZZ32xDlhS8JGkUcFbZJ0xXCrrIbhIfmT7IP9HLIdcAx0mamBKzYh23Ibsytird63NUbttSsinStxf2fw14WdLOVEgSKjhd2ZcExgH/BFxdpW0bgB+T3X/0FgBJO0uaWmdb8n4KHC/pw5I2S+W8K237S6FNvwH2lHSMpBHp9V5Je6Xp2uvI4t2R7gWq9XtmN5BdgTkbuDq1i1TmZEkjyBKLVWTxLRoJbE5K8JR96WFKX9qWpgCXAEdLGqbsywL5BK0v/Zl3JVmffgD4ZaHcV4DXUt1O6qWMB8iuiHUo++2sz+S2Ve2bNP5nSNouItam81WKp1lLOcky65sLgC3JrsDMI5vaGggzyG4mXgZ8iyxpWV1px4i4kayec4CFvPnG4M8DZ0t6lewG82tyx64ku8/sD2mq5gDgX8huYH+ZbFrsujrq+1/AvWQfpteTJQnVnJHqOS9NM/2e7GphPW3ZKCLuJt0wnup6K1niA/BvwN8p++bmD9MU3xSyKb5nyKZqv0OW6ACcQjZV+xxwGdmXHapK919dR3Zjfv7qzrZkSeSLZFeAlgHfq3D8q8CpZH3xIlniOzu3ve62pXX/SJY8LQP2Bu7Mna4v/ZnXfQ/fnIh4Ibf+tFTvV1ObKybWyfnAGrIE8XLS9Gpqa62+OQZ4PI2VE8n+NszaiiIqzQqY2WAg6WrgTxFR+pW0RkkKsimjha2ui5lZK/hKltkgkqZLdk9TRQeR/SzCr1tcLTMzq8C/+G42uLyVbFpnNPA0cFL6aruZmbUZTxeamZmZlcDThWZmZmYlqDldKOkSsl/efT4i3vSsLUki+1bLwcBK4LjuX96VtB54KO36ZEQcUut8Y8aMiQkTJtTdADMzM7NWuffee1+IiLGVttVzT9ZlZL9UfEWV7dPIniW2B9mzyH6U/gV4PSL2aaSyEyZMoKurq5FDzMzMzFpCUtUnQdScLoyI24DlvewyHbgiPWdrHtmjRXZqvJpmZmZmQ0cz7snamZ7P9no6rQPYQlJXeg7XodUKkHRC2q9r6dKlTaiSmZmZWWuVfeP7+IjoJPv13wskVXzmVkTMiojOiOgcO7bitKaZmZnZoNKMJGsJPR+guktaR0R0/7sImAvs24TzmZmZmbW9ZiRZs4FPKXMA8HJEPJseCrs5gKQxwF8DC5pwPjMzM7O2V89POPwCOBAYI+lp4CxgBEBEXEz21PmDyR7aupLs4aUAewEzJW0gS+bOiwgnWWZmZrZJqJlkRcSRNbYHcHKF9XcC7+l71czMzMwGL//iu5mZmVkJnGSZmZmZlcBJlpmZmVkJnGSZmZmZlcBJlpmZmVkJnGSZmZmZlcBJlpmZmVkJnGSZmZmZlcBJlpmZmVkJnGSZmZmZlcBJlpmZmVkJnGSZmZmZlcBJlpmZmVkJnGSZmZmZlcBJlpmZmVkJaiZZki6R9Lykh6tsl6QfSlooab6k/XLbjpX05/Q6tpkVNzMzM2tn9VzJugw4qJft04A90usE4EcAkkYBZwGTgUnAWZJ26E9lzczMzAaLmklWRNwGLO9ll+nAFZGZB2wvaSdgKnBTRCyPiBeBm+g9WRswD826i7lTz+WhWXc1tK0Rt084mmWbjeb2CUf3q5y+uPXoWXSNnsqtR8/qVzlz
Jp/B4hF7MGfyGU0r957RU1mhDu4ZPbXhY/N9U6ufinXvVm8byuq/Zo2vdqlHs8rJ91e+zGI/VuvXovlbTWa1RjB/q8kb11Xr+/z63tpTa+xWGjPV6pvft9EYvqKt2CDxiraqa/9K6o3jQGiXvwkbetpibEVEzRcwAXi4yrbfAH+TW74Z6AROA76eW/8N4LRa59p///2jTPNn3hkr2DLWMixWsGXMn3lnXdsacdv4GbEBNr5uGz+jWdWvae6MmT3OPXfGzD6Vc/OkL/co5+5RU/pdbrGMu0dNqfvYfN+8zubxOiOr9lOx7jdP+nJE1B+bsvqvWeOrXerRrHKK/bWG4bGWYbGG4T3WP9gxqWK/FhX3e7BjUtW+L65flc5dbE+tsVtpzFQbh8V917BZ3TF8mY4ex75MR7/jXS2OA6Fd/iZs6BnIsQV0RZWcpi1ufJd0gqQuSV1Lly4t9VzLrp3LSNYwnPWMYA3Lrp1b17ZGTHzyRgBUWB4IW914bY9zdy83arf7rutRzsTlt/e73GIZ3cv1KPbNCNZW7adi3buX641NWf3XrPHVLvVoVjnF/hrOOoaznuGs67H+nSvv67HcfVxRcb93rryvat8X149I5y62p9bYrTRmqo3D4r7D2FB3DLdmZY9ju5cbUa1erdAufxM29LTL2GpGkrUEGJdb3iWtq7b+TSJiVkR0RkTn2LFjm1Cl6kYfcSBrGMlahrGWkYw+4sC6tjViwa7TAIjC8kBYMe2IHufuXm7U4v0O71HOglHv73e5xTK6l+tR7Ju1jKjaT8W6dy/XG5uy+q9Z46td6tGscor9tY7hrGUY6xjeY/2jHfv1WO4+rqi436Md+1Xt++L6tencxfbUGruVxky1cVjcdz2b1R3D1+jocWz3ciOq1asV2uVvwoaethlb1S5xRdQ9Xfgx4Eay/zE6ALg7rR8FLAZ2SK/FwKha5yp7ujAiu4x4y5RvV7x82Nu2Rtw2fka8oFEDOlXYbe6MmXHPqCl9nirsdvOkL8ei4e/oMd3W33LvHjUlXmPLhqYKu+X7plY/Feverd42lNV/zRpf7VKPZpWT7698mcV+rNavRQ92TIpVDI8HOyZtXFet7/Pre2tPrbFbacxUq29+30Zj+DIdsb6PU4W16tUK7fI3YUPPQI0tepkuVLa9Okm/AA4ExgB/IfvG4IiUoF0sScCFZDe1rwSOj4iudOynga+mos6JiEtrJX2dnZ3R1dVVR3poZmZm1lqS7o2Izkrbhtc6OCKOrLE9gJOrbLsEuKSeSpqZmZkNJW1x47uZmZnZUOMky8zMzKwETrLMzMzMSuAky8zMzKwETrLMzMzMSuAky8zMzKwETrLMzMzMSuAky8zMzKwETrLMzMzMSuAky8zMzKwETrLMzMzMSuAky8zMzKwETrLMzMzMSuAky8zMzKwETrLMzMzMSuAky8zMzKwEdSVZkg6S9KikhZK+UmH7eEk3S5ovaa6kXXLb1kt6IL1mN7PyZmZmZu1qeK0dJA0DLgI+CjwN3CNpdkQsyO32feCKiLhc0oeAc4Fj0rbXI2Kf5lbbzMzMrL3VcyVrErAwIhZFxBrgKmB6YZ+JwJz0/pYK283MzMw2KfUkWTsDT+WWn07r8h4EDk/vDwO2kTQ6LW8hqUvSPEmHVjqBpBPSPl1Lly6tv/ZmZmZmbapZN76fBnxQ0v3AB4ElwPq0bXxEdAJHARdI2r14cETMiojOiOgcO3Zsk6pkZmZm1jo178kiS5jG5ZZ3Ses2iohnSFeyJG0NHBERL6VtS9K/iyTNBfYFHutvxc3MzMzamSKi9x2k4cD/AR8mS67uAY6KiEdy+4wBlkfEBknnAOsj4kxJOwArI2J12ucuYHrhpvni+ZYCT/S3YW1iDPBCqyvRhhyXyhyXyhyXN3NMKnNcKnNcKmtWXMZHRMVpuJpXsiJinaRTgN8Bw4BLIuIRSWcDXRExGzgQOFdSALcBJ6fD9wJmStpANjV5Xm8JVjrf
kJkvlNSVpkotx3GpzHGpzHF5M8ekMselMselsoGISz3ThUTEDcANhXVn5t7/CvhVhePuBN7TzzqamZmZDTr+xXczMzOzEjjJKtesVlegTTkulTkulTkub+aYVOa4VOa4VFZ6XGre+G5mZmZmjfOVLDMzM7MSOMkyMzMzK4GTrD6SdJCkRyUtlPSVCtu/KGmBpPmSbpY0PrftWEl/Tq9jB7bm5elnTNZLeiC9Zg9szctVR1xOlPRQavsdkibmtv1zOu5RSVMHtubl6mtcJE2Q9HpuvFw88LUvT6245PY7QlJI6syt22THS26/HnHZ1MeLpOMkLc21/7O5bUPyswj6HZfmfR5FhF8Nvsh+L+wx4O3ASLJnN04s7PO3QEd6fxJwdXo/CliU/t0hvd+h1W1qZUzS8mutbkML47Jt7v0hwG/T+4lp/82B3VI5w1rdpjaIywTg4Va3oVVxSfttQ/abhPOATo+XXuOySY8X4DjgwgrHDsnPov7GJW1r2ueRr2T1zSRgYUQsiog1wFXA9PwOEXFLRKxMi/PIHkcEMBW4KSKWR8SLwE3AQQNU7zL1JyZDWT1xeSW3uBXQ/W2U6cBVEbE6IhYDC1N5Q0F/4jKU1YxL8q/Ad4BVuXWb9HhJKsVlKKs3LpUM1c8i6F9cmspJVt/sDDyVW346ravmM8CNfTx2sOhPTAC2kNQlaZ6kQ0uoX6vUFRdJJ0t6DPgucGojxw5S/YkLwG6S7pd0q6T3l1vVAVUzLpL2A8ZFxPWNHjuI9ScusAmPl+SIdJvGryR1P4t4kx4vSaW4QBM/j5xklUzS0UAn8L1W16VdVInJ+Mgeb3AUcIGk3VtSuRaJiIsiYnfgDODrra5Pu6gSl2eBXSNiX+CLwJWStm1VHQeSpM2AHwBfanVd2kmNuGyy4yX5b2BCRPwV2dWqy1tcn3bRW1ya9nnkJKtvlgD5rHeXtK4HSR8BvgYcEhGrGzl2EOpPTIiIJenfRcBcYN8yKzuAGu3vq4BD+3jsYNLnuKTpsGXp/b1k917sWU41B1ytuGwDvBuYK+lx4ABgdrrJe1MeL1XjsomPFyJiWe6/tT8B9q/32EGsP3Fp7udRq29QG4wvsmc+LiK7ubT7prq9C/vsS/bHvEdh/ShgMdmNhjuk96Na3aYWx2QHYPP0fgzwZyrc1DoYX3XGZY/c+0+QPXgdYG963si8iKFzI3N/4jK2Ow5kN7YuGQp/Q/XGpbD/XN64wXuTHi+9xGWTHi/ATrn3hwHz0vsh+VnUhLg09fOorgdEW08RsU7SKcDvyL7FcElEPCLpbLIPgtlkU2FbA7+UBPBkRBwSEcsl/StwTyru7IhY3oJmNFV/YgLsBcyUtIHs6up5EbGgJQ1psjrjckq6wrcWeBE4Nh37iKRrgAXAOuDkiFjfkoY0WX/iAnwAOFvSWmADcOJQ+BuCuuNS7dhNfbxUs6mPl1MlHUI2JpaTfauOofpZBP2LC03+PPJjdczMzMxK4HuyzMzMzErgJMvMzMysBE6yzMzMzErgJMvMzMysBE6yzMzMzErgJMvMzMysBE6yzMzMzErw/3RPm2ekjAmEAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 720x72 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "# use the model to predict the test inputs\n",
    "predictions = model.predict(inputs_test)\n",
    "\n",
    "# print the predictions and the expected outputs\n",
    "print(\"predictions =\\n\", np.round(predictions, decimals=3))\n",
    "print(\"actual =\\n\", outputs_test)\n",
    "\n",
    "# Plot the predictions along with the test data\n",
    "plt.clf()\n",
    "plt.title('Test data predicted vs actual values')\n",
    "plt.plot(inputs_test, outputs_test, 'b.', label='Actual')\n",
    "plt.plot(inputs_test, predictions, 'r.', label='Predicted')\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "j7DO6xxXVCym"
   },
   "source": [
    "# Convert the Trained Model to TensorFlow Lite\n",
    "\n",
    "The next cell converts the model to the TensorFlow Lite format and prints the model's size in bytes."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "id": "0Xn1-Rn9Cp_8"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "WARNING:tensorflow:From /Users/xiao/opt/miniconda3/envs/tflite/lib/python3.8/site-packages/tensorflow/python/training/tracking/tracking.py:111: Model.state_updates (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.\n",
      "Instructions for updating:\n",
      "This property should not be used in TensorFlow 2.0, as updates are applied automatically.\n",
      "WARNING:tensorflow:From /Users/xiao/opt/miniconda3/envs/tflite/lib/python3.8/site-packages/tensorflow/python/training/tracking/tracking.py:111: Layer.updates (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version.\n",
      "Instructions for updating:\n",
      "This property should not be used in TensorFlow 2.0, as updates are applied automatically.\n",
      "INFO:tensorflow:Assets written to: /var/folders/8m/sl9ns0293bv6ld4590r_j7fr0000gn/T/tmp5cnkqwqd/assets\n",
      "Model is 1916 bytes\n"
     ]
    }
   ],
   "source": [
    "# Convert the model to the TensorFlow Lite format without quantization\n",
    "converter = tf.lite.TFLiteConverter.from_keras_model(model)\n",
    "tflite_model = converter.convert()\n",
    "\n",
    "# Save the model to disk; the with-block ensures the file is\n",
    "# closed and flushed before its size is measured\n",
    "with open(\"gesture_model.tflite\", \"wb\") as f:\n",
    "    f.write(tflite_model)\n",
    "\n",
    "import os\n",
    "basic_model_size = os.path.getsize(\"gesture_model.tflite\")\n",
    "print(\"Model is %d bytes\" % basic_model_size)\n"
   ]
  },
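  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If the model needs to be smaller still, post-training quantization can shrink it further at the cost of a small amount of accuracy. A minimal sketch (not run here; it reuses the converter API from the cell above):\n",
    "\n",
    "```python\n",
    "# Optional: let the converter quantize weights to reduce model size\n",
    "converter = tf.lite.TFLiteConverter.from_keras_model(model)\n",
    "converter.optimizations = [tf.lite.Optimize.DEFAULT]\n",
    "quantized_model = converter.convert()\n",
    "```"
   ]
  },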
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "ykccQn7SXrUX"
   },
   "source": [
    "## Encode the Model in an Arduino Header File\n",
    "\n",
    "The next cell creates a constant byte array containing the TensorFlow Lite model. Import it as a new tab in the Arduino sketch from the tutorial."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "id": "9J33uwpNtAku"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Header file, model.h, is 11,850 bytes.\n",
      "\n",
      "Open the side panel (refresh if needed). Double click model.h to download the file.\n"
     ]
    }
   ],
   "source": [
    "!echo \"const unsigned char model[] = {\" > content/model.h\n",
    "!cat gesture_model.tflite | xxd -i      >> content/model.h\n",
    "!echo \"};\"                              >> content/model.h\n",
    "\n",
    "import os\n",
    "model_h_size = os.path.getsize(\"content/model.h\")\n",
    "print(f\"Header file, model.h, is {model_h_size:,} bytes.\")\n",
    "print(\"\\nOpen the side panel (refresh if needed). Double click model.h to download the file.\")"
   ]
  },
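  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before moving to the board, the converted model can be sanity-checked on the desktop with the TensorFlow Lite interpreter. A minimal sketch (not run here; it assumes `inputs_test` is the NumPy array used in the prediction cell above):\n",
    "\n",
    "```python\n",
    "# Run one test sample through the converted model\n",
    "interpreter = tf.lite.Interpreter(model_content=tflite_model)\n",
    "interpreter.allocate_tensors()\n",
    "input_details = interpreter.get_input_details()[0]\n",
    "output_details = interpreter.get_output_details()[0]\n",
    "interpreter.set_tensor(input_details['index'], inputs_test[:1].astype('float32'))\n",
    "interpreter.invoke()\n",
    "print(interpreter.get_tensor(output_details['index']))\n",
    "```"
   ]
  },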
  {
   "cell_type": "markdown",
   "metadata": {
    "id": "1eSkHZaLzMId"
   },
   "source": [
    "# Realtime Classification of Sensor Data on Arduino\n",
    "\n",
    "Now it's time to switch back to the tutorial instructions and run our new model on the [Arduino Nano 33 BLE Sense](https://www.arduino.cc/en/Guide/NANO33BLE)."
   ]
  }
 ],
 "metadata": {
  "colab": {
   "collapsed_sections": [],
   "name": "FruitToEmoji-GIT.ipynb",
   "provenance": [],
   "toc_visible": true
  },
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
