{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "colab": {
      "name": "DDQN.ipynb",
      "provenance": [],
      "collapsed_sections": []
    }
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "nFCDYptQl7xw",
        "colab_type": "text"
      },
      "source": [
        "# Double Deep Q-Network"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "qmKwkKc8l7x2",
        "colab_type": "text"
      },
      "source": [
        "In 2016, Google DeepMind ([Link](https://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/download/12389/11847)) altered the DQN algorithm the same way the original Q-Learner was updated: by adding a second network. The team found that the overestimation that affected the Q-Learner also affected the DQN algorithm. When the original Double Q-Learner was introduced, it was shown to work in the tabular setting; in this paper the authors show the idea generalizes to large-scale function approximation."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "collapsed": true,
        "id": "03jJ9lXgl7x4",
        "colab_type": "text"
      },
      "source": [
        "**Summary**  \n",
        "This algorithm combines the benefits of the Double Q-Learner with the benefits of deep learning: function approximation gives us a continuous state space, and the second network keeps us from the overestimation that Q-Learning suffers from. This document is a combination of the DQN and Double Q-Learner notebooks, so it will be mostly review. But this algorithm is much more powerful, so you should explore the other gyms at OpenAI and see what you can solve.  \n",
        "\n",
        "One thing to note: we need to periodically copy the weights of the model neural network into the target neural network. We didn't need to do this in the Double Q-Learner because both tables were being updated directly."
      ]
    },
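    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a sketch, the weight sync mentioned above is just a hard copy of the model network's parameters $\\theta$ into the target network's parameters $\\theta^{-}$:  \n",
        "\n",
        "$$\\theta^{-} \\leftarrow \\theta$$"
      ]
    },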
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "6I3zr2ajl7x7",
        "colab_type": "text"
      },
      "source": [
        "**CartPole Example**  \n",
        "Again we will use the [CartPole](https://gym.openai.com/envs/CartPole-v1/) environment from OpenAI.  \n",
        "\n",
        "The actions are 0 to push the cart to the left and 1 to push the cart to the right.  \n",
        "\n",
        "The continuous state space has four components: the cart's X position, the cart's velocity, the pole's angle, and the velocity at the tip of the pole. The X position ranges from -4.8 to +4.8, the velocities from -Inf to +Inf, and the pole angle from -24 to +24 degrees. With all of the possible combinations you can see why we can't create a Q-table entry for each one.  \n",
        "\n",
        "To \"solve\" this puzzle you need an average reward greater than 195 over 100 consecutive episodes. One thing to note: I hard-cap each episode at 210 steps, so the average can never exceed 210, and cutting long episodes short can also drag the average down."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "-lZPwAxel7x9",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "dc65059c-2996-4b1b-9c4e-3c08c70cbb29"
      },
      "source": [
        "#Imports and gym creation\n",
        "import gym\n",
        "import numpy as np\n",
        "import matplotlib.pyplot as plt\n",
        "from collections import deque\n",
        "import tensorflow as tf\n",
        "from tensorflow import keras\n",
        "#from keras.models import Sequential\n",
        "#from keras.layers import Dense\n",
        "#from keras.optimizers import Adam\n",
        "import random\n",
        "\n",
        "#Create Gym\n",
        "from gym import wrappers\n",
        "envCartPole = gym.make('CartPole-v1')\n",
        "envCartPole.seed(50)"
      ],
      "execution_count": 1,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[50]"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 1
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "5VGDdEytl7yK",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "EPISODES = 500\n",
        "TRAIN_END = 0"
      ],
      "execution_count": 2,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "BtS6cFn3l7yS",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "def discount_rate(): #Gamma\n",
        "    return 0.95\n",
        "\n",
        "def learning_rate(): #Alpha\n",
        "    return 0.001\n",
        "\n",
        "def batch_size():\n",
        "    return 24"
      ],
      "execution_count": 3,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "UR9pYntLl7ym",
        "colab_type": "text"
      },
      "source": [
        "**Double Deep Q-Network Class**  \n",
        "This class is the same as the DQN class from the last notebook with a few exceptions.  \n",
        "**init**:  \n",
        "&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;We create a second NN for the target network  \n",
        "\n",
        "**update_target_from_model(self)**  \n",
        "&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;This method copies the weights of the model NN into the target NN\n",
        "\n",
        "**build_model(self)**:  \n",
        "**action(self,state)**:  \n",
        "**test_action(self,state)**:  \n",
        "**store(self, state, action, reward, nstate, done)**:  \n",
        "&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Same  \n",
        "\n",
        "**experience_replay(self, batch_size)**:  \n",
        "&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;This method has the Double DQN changes. We grab the next-state predictions from the target NN and use them in the Q update rule.    "
      ]
    },
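    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To make the Double DQN change concrete, the targets differ only in how the next-state action is chosen. In the plain DQN from the last notebook, the target takes the max over a single network's predictions:  \n",
        "\n",
        "$$y = r + \\gamma \\max_{a'} Q(s', a')$$\n",
        "\n",
        "Double DQN instead selects the action with the model NN and evaluates it with the target NN:  \n",
        "\n",
        "$$y = r + \\gamma \\, Q_{target}\\big(s', \\mathrm{argmax}_{a'} \\, Q_{model}(s', a')\\big)$$"
      ]
    },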
    {
      "cell_type": "code",
      "metadata": {
        "id": "xDZtfUW0l7yn",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "class DoubleDeepQNetwork():\n",
        "    def __init__(self, states, actions, alpha, gamma, epsilon,epsilon_min, epsilon_decay):\n",
        "        self.nS = states\n",
        "        self.nA = actions\n",
        "        self.memory = deque([], maxlen=2500)\n",
        "        self.alpha = alpha\n",
        "        self.gamma = gamma\n",
        "        #Explore/Exploit\n",
        "        self.epsilon = epsilon\n",
        "        self.epsilon_min = epsilon_min\n",
        "        self.epsilon_decay = epsilon_decay\n",
        "        self.model = self.build_model()\n",
        "        self.model_target = self.build_model() #Second (target) neural network\n",
        "        self.update_target_from_model() #Update weights\n",
        "        self.loss = []\n",
        "        \n",
        "    def build_model(self):\n",
        "        model = keras.Sequential() #linear stack of layers https://keras.io/models/sequential/\n",
        "        model.add(keras.layers.Dense(24, input_dim=self.nS, activation='relu')) #[Input] -> Layer 1\n",
        "        #   Dense: Densely connected layer https://keras.io/layers/core/\n",
        "        #   24: Number of neurons\n",
        "        #   input_dim: Number of input variables\n",
        "        #   activation: Rectified Linear Unit (relu) ranges >= 0\n",
        "        model.add(keras.layers.Dense(24, activation='relu')) #Layer 2 -> 3\n",
        "        model.add(keras.layers.Dense(self.nA, activation='linear')) #Layer 3 -> [output]\n",
        "        #   Size has to match the output (different actions)\n",
        "        #   Linear activation on the last layer\n",
        "        model.compile(loss='mean_squared_error', #Loss function: Mean Squared Error\n",
        "                      optimizer=keras.optimizers.Adam(learning_rate=self.alpha)) #Optimizer: Adam (feel free to try other options)\n",
        "        return model\n",
        "\n",
        "    def update_target_from_model(self):\n",
        "        #Update the target model from the base model\n",
        "        self.model_target.set_weights( self.model.get_weights() )\n",
        "\n",
        "    def action(self, state):\n",
        "        if np.random.rand() <= self.epsilon:\n",
        "            return random.randrange(self.nA) #Explore\n",
        "        action_vals = self.model.predict(state) #Exploit: Use the NN to predict the correct action from this state\n",
        "        return np.argmax(action_vals[0])\n",
        "\n",
        "    def test_action(self, state): #Exploit\n",
        "        action_vals = self.model.predict(state)\n",
        "        return np.argmax(action_vals[0])\n",
        "\n",
        "    def store(self, state, action, reward, nstate, done):\n",
        "        #Store the experience in memory\n",
        "        self.memory.append( (state, action, reward, nstate, done) )\n",
        "\n",
        "    def experience_replay(self, batch_size):\n",
        "        #Execute the experience replay\n",
        "        minibatch = random.sample( self.memory, batch_size ) #Randomly sample from memory\n",
        "\n",
        "        #Convert to numpy for speed by vectorization\n",
        "        x = []\n",
        "        y = []\n",
        "        np_array = np.array(minibatch, dtype=object) #dtype=object since each experience mixes arrays and scalars\n",
        "        st = np.zeros((0,self.nS)) #States\n",
        "        nst = np.zeros( (0,self.nS) )#Next States\n",
        "        for i in range(len(np_array)): #Creating the state and next state np arrays\n",
        "            st = np.append( st, np_array[i,0], axis=0)\n",
        "            nst = np.append( nst, np_array[i,3], axis=0)\n",
        "        st_predict = self.model.predict(st) #Here is the speedup! I can predict on the ENTIRE batch\n",
        "        nst_predict = self.model.predict(nst)\n",
        "        nst_predict_target = self.model_target.predict(nst) #Predict from the TARGET\n",
        "        index = 0\n",
        "        for state, action, reward, nstate, done in minibatch:\n",
        "            x.append(state)\n",
        "            #Predict from state\n",
        "            nst_action_predict_target = nst_predict_target[index]\n",
        "            nst_action_predict_model = nst_predict[index]\n",
        "            if done == True: #Terminal state: the target is just the reward\n",
        "                target = reward\n",
        "            else:   #Non terminal\n",
        "                target = reward + self.gamma * nst_action_predict_target[np.argmax(nst_action_predict_model)] #Select the action with the model NN, evaluate it with the target NN: the Double DQN change\n",
        "            target_f = st_predict[index]\n",
        "            target_f[action] = target\n",
        "            y.append(target_f)\n",
        "            index += 1\n",
        "        #Reshape for Keras Fit\n",
        "        x_reshape = np.array(x).reshape(batch_size,self.nS)\n",
        "        y_reshape = np.array(y)\n",
        "        epoch_count = 1\n",
        "        hist = self.model.fit(x_reshape, y_reshape, epochs=epoch_count, verbose=0)\n",
        "        #Graph Losses\n",
        "        for i in range(epoch_count):\n",
        "            self.loss.append( hist.history['loss'][i] )\n",
        "        #Decay Epsilon\n",
        "        if self.epsilon > self.epsilon_min:\n",
        "            self.epsilon *= self.epsilon_decay"
      ],
      "execution_count": 4,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "-KiANrlYl7ys",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "#Create the agents\n",
        "nS = envCartPole.observation_space.shape[0] #This is only 4\n",
        "nA = envCartPole.action_space.n #Actions\n",
        "dqn = DoubleDeepQNetwork(nS, nA, learning_rate(), discount_rate(), 1, 0.001, 0.995 )\n",
        "\n",
        "batch_size = batch_size() #Note: this rebinds the name, shadowing the batch_size() function above"
      ],
      "execution_count": 5,
      "outputs": []
    },
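    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "A quick sanity check on the exploration schedule above (start $\\epsilon = 1$, decay factor $0.995$ applied once per experience-replay call, floor $0.001$): the floor is reached after roughly  \n",
        "\n",
        "$$n = \\frac{\\ln(0.001)}{\\ln(0.995)} \\approx 1378 \\text{ replay calls,}$$\n",
        "\n",
        "which is consistent with the training log below, where $\\epsilon$ bottoms out in the mid-60s episodes."
      ]
    },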
    {
      "cell_type": "code",
      "metadata": {
        "id": "6XdxLERIl7yy",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        },
        "outputId": "c68f3d5d-36d5-4f74-ccc6-9fa7cd5ae6e4"
      },
      "source": [
        "#Training\n",
        "rewards = [] #Store rewards for graphing\n",
        "epsilons = [] # Store the Explore/Exploit\n",
        "TEST_Episodes = 0\n",
        "for e in range(EPISODES):\n",
        "    state = envCartPole.reset()\n",
        "    state = np.reshape(state, [1, nS]) # Resize to store in memory to pass to .predict\n",
        "    tot_rewards = 0\n",
        "    for time in range(210): #Cap each episode at 210 steps (CartPole-v1 itself ends episodes at 500 steps)\n",
        "        action = dqn.action(state)\n",
        "        nstate, reward, done, _ = envCartPole.step(action)\n",
        "        nstate = np.reshape(nstate, [1, nS])\n",
        "        tot_rewards += reward\n",
        "        dqn.store(state, action, reward, nstate, done) #The states were already reshaped for .predict\n",
        "        state = nstate\n",
        "        #done: CartPole fell. \n",
        "        #time == 209: CartPole stayed upright\n",
        "        if done or time == 209:\n",
        "            rewards.append(tot_rewards)\n",
        "            epsilons.append(dqn.epsilon)\n",
        "            print(\"episode: {}/{}, score: {}, e: {}\"\n",
        "                  .format(e, EPISODES, tot_rewards, dqn.epsilon))\n",
        "            break\n",
        "        #Experience Replay\n",
        "        if len(dqn.memory) > batch_size:\n",
        "            dqn.experience_replay(batch_size)\n",
        "    #Update the weights after each episode (you can configure this to run every x steps instead)\n",
        "    dqn.update_target_from_model()\n",
        "    #If our current NN passes we are done\n",
        "    #I am going to use the last 5 runs\n",
        "    if len(rewards) > 5 and np.average(rewards[-5:]) > 195:\n",
        "        #Set the rest of the EPISODES for testing\n",
        "        TEST_Episodes = EPISODES - e\n",
        "        TRAIN_END = e\n",
        "        break"
      ],
      "execution_count": 6,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "episode: 0/500, score: 15.0, e: 1\n",
            "episode: 1/500, score: 32.0, e: 0.8955869907338783\n",
            "episode: 2/500, score: 17.0, e: 0.8265651079747222\n",
            "episode: 3/500, score: 10.0, e: 0.7901049725470279\n",
            "episode: 4/500, score: 19.0, e: 0.7219385759785162\n",
            "episode: 5/500, score: 13.0, e: 0.6797938283326578\n",
            "episode: 6/500, score: 13.0, e: 0.6401093727576664\n",
            "episode: 7/500, score: 25.0, e: 0.567555222460375\n",
            "episode: 8/500, score: 12.0, e: 0.5371084840724134\n",
            "episode: 9/500, score: 10.0, e: 0.5134164023722473\n",
            "episode: 10/500, score: 9.0, e: 0.4932355662165453\n",
            "episode: 11/500, score: 13.0, e: 0.46444185833082485\n",
            "episode: 12/500, score: 12.0, e: 0.43952667968844233\n",
            "episode: 13/500, score: 9.0, e: 0.4222502236424958\n",
            "episode: 14/500, score: 10.0, e: 0.4036245882390106\n",
            "episode: 15/500, score: 12.0, e: 0.3819719776053028\n",
            "episode: 16/500, score: 9.0, e: 0.3669578217261671\n",
            "episode: 17/500, score: 10.0, e: 0.3507711574848344\n",
            "episode: 18/500, score: 8.0, e: 0.3386767948568688\n",
            "episode: 19/500, score: 9.0, e: 0.3253644408394192\n",
            "episode: 20/500, score: 10.0, e: 0.31101247816653554\n",
            "episode: 21/500, score: 11.0, e: 0.29580711868545667\n",
            "episode: 22/500, score: 14.0, e: 0.27714603575484437\n",
            "episode: 23/500, score: 10.0, e: 0.2649210072611673\n",
            "episode: 24/500, score: 8.0, e: 0.25578670228422234\n",
            "episode: 25/500, score: 13.0, e: 0.2408545925762412\n",
            "episode: 26/500, score: 15.0, e: 0.22453190559909803\n",
            "episode: 27/500, score: 10.0, e: 0.21462770857094118\n",
            "episode: 28/500, score: 8.0, e: 0.20722748400265262\n",
            "episode: 29/500, score: 14.0, e: 0.19415447453059972\n",
            "episode: 30/500, score: 12.0, e: 0.18373897616330553\n",
            "episode: 31/500, score: 14.0, e: 0.17214774642209296\n",
            "episode: 32/500, score: 11.0, e: 0.16373146555890544\n",
            "episode: 33/500, score: 10.0, e: 0.15650920157696743\n",
            "episode: 34/500, score: 11.0, e: 0.14885748713096328\n",
            "episode: 35/500, score: 14.0, e: 0.13946676683816583\n",
            "episode: 36/500, score: 11.0, e: 0.13264825480308728\n",
            "episode: 37/500, score: 8.0, e: 0.12807462877562611\n",
            "episode: 38/500, score: 9.0, e: 0.12304040492325048\n",
            "episode: 39/500, score: 10.0, e: 0.1176130407830293\n",
            "episode: 40/500, score: 15.0, e: 0.10964241905397228\n",
            "episode: 41/500, score: 21.0, e: 0.09918368135888474\n",
            "episode: 42/500, score: 9.0, e: 0.09528507271768329\n",
            "episode: 43/500, score: 18.0, e: 0.08750185146499175\n",
            "episode: 44/500, score: 23.0, e: 0.07836551983717477\n",
            "episode: 45/500, score: 16.0, e: 0.07268942442628039\n",
            "episode: 46/500, score: 16.0, e: 0.06742445445908266\n",
            "episode: 47/500, score: 14.0, e: 0.06317096204211972\n",
            "episode: 48/500, score: 16.0, e: 0.058595424120670696\n",
            "episode: 49/500, score: 29.0, e: 0.05092252885731386\n",
            "episode: 50/500, score: 9.0, e: 0.04892091923449087\n",
            "episode: 51/500, score: 8.0, e: 0.047234157581800176\n",
            "episode: 52/500, score: 45.0, e: 0.0378853869148274\n",
            "episode: 53/500, score: 11.0, e: 0.036033175291307735\n",
            "episode: 54/500, score: 10.0, e: 0.034443736736092176\n",
            "episode: 55/500, score: 134.0, e: 0.017683979399301233\n",
            "episode: 56/500, score: 37.0, e: 0.01476423357148172\n",
            "episode: 57/500, score: 50.0, e: 0.01154893304942575\n",
            "episode: 58/500, score: 36.0, e: 0.009690578183705511\n",
            "episode: 59/500, score: 47.0, e: 0.0076950492560818795\n",
            "episode: 60/500, score: 23.0, e: 0.006891586006803337\n",
            "episode: 61/500, score: 31.0, e: 0.005929411657474116\n",
            "episode: 62/500, score: 43.0, e: 0.004803756600407726\n",
            "episode: 63/500, score: 44.0, e: 0.003872339856794843\n",
            "episode: 64/500, score: 57.0, e: 0.002924596716205228\n",
            "episode: 65/500, score: 92.0, e: 0.0018533879534277063\n",
            "episode: 66/500, score: 69.0, e: 0.0013180619611012898\n",
            "episode: 67/500, score: 77.0, e: 0.0009954703940636294\n",
            "episode: 68/500, score: 63.0, e: 0.0009954703940636294\n",
            "episode: 69/500, score: 57.0, e: 0.0009954703940636294\n",
            "episode: 70/500, score: 52.0, e: 0.0009954703940636294\n",
            "episode: 71/500, score: 50.0, e: 0.0009954703940636294\n",
            "episode: 72/500, score: 88.0, e: 0.0009954703940636294\n",
            "episode: 73/500, score: 121.0, e: 0.0009954703940636294\n",
            "episode: 74/500, score: 63.0, e: 0.0009954703940636294\n",
            "episode: 75/500, score: 105.0, e: 0.0009954703940636294\n",
            "episode: 76/500, score: 90.0, e: 0.0009954703940636294\n",
            "episode: 77/500, score: 84.0, e: 0.0009954703940636294\n",
            "episode: 78/500, score: 77.0, e: 0.0009954703940636294\n",
            "episode: 79/500, score: 130.0, e: 0.0009954703940636294\n",
            "episode: 80/500, score: 118.0, e: 0.0009954703940636294\n",
            "episode: 81/500, score: 76.0, e: 0.0009954703940636294\n",
            "episode: 82/500, score: 93.0, e: 0.0009954703940636294\n",
            "episode: 83/500, score: 102.0, e: 0.0009954703940636294\n",
            "episode: 84/500, score: 89.0, e: 0.0009954703940636294\n",
            "episode: 85/500, score: 210.0, e: 0.0009954703940636294\n",
            "episode: 86/500, score: 137.0, e: 0.0009954703940636294\n",
            "episode: 87/500, score: 155.0, e: 0.0009954703940636294\n",
            "episode: 88/500, score: 178.0, e: 0.0009954703940636294\n",
            "episode: 89/500, score: 186.0, e: 0.0009954703940636294\n",
            "episode: 90/500, score: 156.0, e: 0.0009954703940636294\n",
            "episode: 91/500, score: 210.0, e: 0.0009954703940636294\n",
            "episode: 92/500, score: 210.0, e: 0.0009954703940636294\n",
            "episode: 93/500, score: 210.0, e: 0.0009954703940636294\n",
            "episode: 94/500, score: 210.0, e: 0.0009954703940636294\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "sOJOCs1zl7y4",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        },
        "outputId": "3b877c4f-1c77-46ca-d9bb-8dd8945edff4"
      },
      "source": [
        "#Testing\n",
        "print('Training complete. Testing started...')\n",
        "TEST_Episodes = 100 #Override the value computed during training so we always run 100 test episodes\n",
        "#TEST Time\n",
        "#   In this section we ALWAYS exploit and never train\n",
        "for e_test in range(TEST_Episodes):\n",
        "    state = envCartPole.reset()\n",
        "    state = np.reshape(state, [1, nS])\n",
        "    tot_rewards = 0\n",
        "    for t_test in range(210):\n",
        "        action = dqn.test_action(state)\n",
        "        nstate, reward, done, _ = envCartPole.step(action)\n",
        "        nstate = np.reshape( nstate, [1, nS])\n",
        "        tot_rewards += reward\n",
        "        #DON'T STORE ANYTHING DURING TESTING\n",
        "        state = nstate\n",
        "        #done: CartPole fell. \n",
        "        #t_test == 209: CartPole stayed upright\n",
        "        if done or t_test == 209: \n",
        "            rewards.append(tot_rewards)\n",
        "            epsilons.append(0) #We are doing full exploit\n",
        "            print(\"episode: {}/{}, score: {}, e: {}\"\n",
        "                  .format(e_test, TEST_Episodes, tot_rewards, 0))\n",
        "            break"
      ],
      "execution_count": 10,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Training complete. Testing started...\n",
            "episode: 0/100, score: 210.0, e: 0\n",
            "episode: 1/100, score: 210.0, e: 0\n",
            "episode: 2/100, score: 210.0, e: 0\n",
            "episode: 3/100, score: 210.0, e: 0\n",
            "episode: 4/100, score: 210.0, e: 0\n",
            "episode: 5/100, score: 210.0, e: 0\n",
            "episode: 6/100, score: 210.0, e: 0\n",
            "episode: 7/100, score: 210.0, e: 0\n",
            "episode: 8/100, score: 210.0, e: 0\n",
            "episode: 9/100, score: 210.0, e: 0\n",
            "episode: 10/100, score: 210.0, e: 0\n",
            "episode: 11/100, score: 210.0, e: 0\n",
            "episode: 12/100, score: 210.0, e: 0\n",
            "episode: 13/100, score: 210.0, e: 0\n",
            "episode: 14/100, score: 210.0, e: 0\n",
            "episode: 15/100, score: 210.0, e: 0\n",
            "episode: 16/100, score: 210.0, e: 0\n",
            "episode: 17/100, score: 210.0, e: 0\n",
            "episode: 18/100, score: 210.0, e: 0\n",
            "episode: 19/100, score: 210.0, e: 0\n",
            "episode: 20/100, score: 210.0, e: 0\n",
            "episode: 21/100, score: 210.0, e: 0\n",
            "episode: 22/100, score: 210.0, e: 0\n",
            "episode: 23/100, score: 210.0, e: 0\n",
            "episode: 24/100, score: 210.0, e: 0\n",
            "episode: 25/100, score: 210.0, e: 0\n",
            "episode: 26/100, score: 210.0, e: 0\n",
            "episode: 27/100, score: 210.0, e: 0\n",
            "episode: 28/100, score: 210.0, e: 0\n",
            "episode: 29/100, score: 210.0, e: 0\n",
            "episode: 30/100, score: 210.0, e: 0\n",
            "episode: 31/100, score: 210.0, e: 0\n",
            "episode: 32/100, score: 210.0, e: 0\n",
            "episode: 33/100, score: 210.0, e: 0\n",
            "episode: 34/100, score: 210.0, e: 0\n",
            "episode: 35/100, score: 210.0, e: 0\n",
            "episode: 36/100, score: 210.0, e: 0\n",
            "episode: 37/100, score: 210.0, e: 0\n",
            "episode: 38/100, score: 210.0, e: 0\n",
            "episode: 39/100, score: 210.0, e: 0\n",
            "episode: 40/100, score: 210.0, e: 0\n",
            "episode: 41/100, score: 210.0, e: 0\n",
            "episode: 42/100, score: 210.0, e: 0\n",
            "episode: 43/100, score: 210.0, e: 0\n",
            "episode: 44/100, score: 210.0, e: 0\n",
            "episode: 45/100, score: 210.0, e: 0\n",
            "episode: 46/100, score: 210.0, e: 0\n",
            "episode: 47/100, score: 210.0, e: 0\n",
            "episode: 48/100, score: 210.0, e: 0\n",
            "episode: 49/100, score: 210.0, e: 0\n",
            "episode: 50/100, score: 210.0, e: 0\n",
            "episode: 51/100, score: 210.0, e: 0\n",
            "episode: 52/100, score: 210.0, e: 0\n",
            "episode: 53/100, score: 210.0, e: 0\n",
            "episode: 54/100, score: 210.0, e: 0\n",
            "episode: 55/100, score: 210.0, e: 0\n",
            "episode: 56/100, score: 210.0, e: 0\n",
            "episode: 57/100, score: 210.0, e: 0\n",
            "episode: 58/100, score: 210.0, e: 0\n",
            "episode: 59/100, score: 210.0, e: 0\n",
            "episode: 60/100, score: 210.0, e: 0\n",
            "episode: 61/100, score: 210.0, e: 0\n",
            "episode: 62/100, score: 210.0, e: 0\n",
            "episode: 63/100, score: 210.0, e: 0\n",
            "episode: 64/100, score: 210.0, e: 0\n",
            "episode: 65/100, score: 210.0, e: 0\n",
            "episode: 66/100, score: 210.0, e: 0\n",
            "episode: 67/100, score: 210.0, e: 0\n",
            "episode: 68/100, score: 210.0, e: 0\n",
            "episode: 69/100, score: 210.0, e: 0\n",
            "episode: 70/100, score: 210.0, e: 0\n",
            "episode: 71/100, score: 210.0, e: 0\n",
            "episode: 72/100, score: 210.0, e: 0\n",
            "episode: 73/100, score: 210.0, e: 0\n",
            "episode: 74/100, score: 210.0, e: 0\n",
            "episode: 75/100, score: 210.0, e: 0\n",
            "episode: 76/100, score: 210.0, e: 0\n",
            "episode: 77/100, score: 210.0, e: 0\n",
            "episode: 78/100, score: 210.0, e: 0\n",
            "episode: 79/100, score: 210.0, e: 0\n",
            "episode: 80/100, score: 210.0, e: 0\n",
            "episode: 81/100, score: 210.0, e: 0\n",
            "episode: 82/100, score: 210.0, e: 0\n",
            "episode: 83/100, score: 210.0, e: 0\n",
            "episode: 84/100, score: 210.0, e: 0\n",
            "episode: 85/100, score: 210.0, e: 0\n",
            "episode: 86/100, score: 210.0, e: 0\n",
            "episode: 87/100, score: 210.0, e: 0\n",
            "episode: 88/100, score: 210.0, e: 0\n",
            "episode: 89/100, score: 210.0, e: 0\n",
            "episode: 90/100, score: 210.0, e: 0\n",
            "episode: 91/100, score: 210.0, e: 0\n",
            "episode: 92/100, score: 210.0, e: 0\n",
            "episode: 93/100, score: 210.0, e: 0\n",
            "episode: 94/100, score: 210.0, e: 0\n",
            "episode: 95/100, score: 210.0, e: 0\n",
            "episode: 96/100, score: 210.0, e: 0\n",
            "episode: 97/100, score: 210.0, e: 0\n",
            "episode: 98/100, score: 210.0, e: 0\n",
            "episode: 99/100, score: 210.0, e: 0\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "R9RA93KWl7y_",
        "colab_type": "text"
      },
      "source": [
        "**Results**  \n",
        "Here is a graph of the results. If everything worked correctly, the rewards should rise above the red line.  \n",
        "\n",
        "Black: This is the 100 episode rolling average  \n",
        "Red: This is the \"solved\" line at 195  \n",
        "Blue: This is the reward for each episode  \n",
        "Green: This is the value of epsilon scaled by 200  \n",
        "Yellow: This is where the tests started."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "H_0FrVCWl7zA",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 265
        },
        "outputId": "1fc0dc81-4fd1-4ac2-b6f8-74cdc7810e02"
      },
      "source": [
        "#Plotting\n",
        "rolling_average = np.convolve(rewards, np.ones(100)/100)\n",
        "\n",
        "plt.plot(rewards)\n",
        "plt.plot(rolling_average, color='black')\n",
        "plt.axhline(y=195, color='r', linestyle='-') #Solved Line\n",
        "#Scale Epsilon (0.001 - 1.0) to match reward (0 - 200) range\n",
        "eps_graph = [200*x for x in epsilons]\n",
        "plt.plot(eps_graph, color='g', linestyle='-')\n",
        "#Plot the line where TESTING begins\n",
        "plt.axvline(x=TRAIN_END, color='y', linestyle='-')\n",
        "plt.xlim( (0,200) )\n",
        "plt.ylim( (0,220) )\n",
        "plt.show()"
      ],
      "execution_count": 12,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYAAAAD4CAYAAADlwTGnAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOydd3yT5fr/33eSppNOSimlBWQKyBbBzQERlCHiYagMlQOc48atPwdf90BED6CoIB4UAVkyFBEEREUposgQyyiUUqB7pW2a5P79kdG0tLR0PUl7v1+vvJo+eZJcaJ/781zjvi4hpUShUCgUjQ+d1gYoFAqFQhuUACgUCkUjRQmAQqFQNFKUACgUCkUjRQmAQqFQNFIMWhsA0LRpU9m6dWutzVA0IkymwwAEBHTU2BKFovrs2bMnTUoZWd33e4QAtG7dmvj4eK3NUDQi9u69HoCePbdpaodCUROEECdq8n4VAlIoFIpGihIAhUKhaKQoAVAoFIpGihIAhUKhaKQoAVAoFIpGihIAhUKhaKQoAVAoFIpGikfsA1AoqsPqvacY3Lk5gb4X/jNesusE53IKSx1rbzAB8P23h+vMPoWiPKyWYo7u283xg78hrTZNbVECoPBKkjJMPLzsD94ZK7ilZ0yF553LLeT/rdkPgBAlx5+4vACA93YfqVM7FY0bKSXF545TkPg7xWknMaedpDg9CWkucJwhLvj+ukYJgMIrKbLY75wKi60XPK/QbD9v1j+7M7p3S9fxvXvfBOD4rTfXkYWKxoiUkmPHjmE2m9m8eTPz5s3jxGG7l9m8eXO6dOlC5843MnDgQAYPHoy/v3+Nvk+ImgmIEgCFV2K12SfZFVfiQhdZ7ALh66PSXYq65cSJE9x9991s3brVdax///588MEHjBgxgubNm2toXfkoAVB4JU4BMFsvPNLU6Sn4GvR1bpOicSKl5JNPPuHBBx9ESsnrr79ObGwsHTt2pFevXlqbd0GUACi8kov2AAzKA1DUPikpKUydOpX169dz3XXXsWjRItq0aaO1WVVGXRUKr8Risy/8xZZKBKDY6QGoP3VF7bJu3Tq6du3Kd999xzvvvMPWrVu9avEHJQAKL8Umq+oBOATAR4WAFLVDYWEhzz77LCNGjKB169bs3buXBx98EJ3O+5bTSi0WQsQKIb4XQhwUQhwQQjzoOB4uhNgshEhw/AxzHBdCiHeFEEeEEPuEEJ4dBFN4JRZrVXMAKgSkqB2KioqYN28e7dq146WXXmLy5Mns3LmTTp06aW1atanKVWEBHpFSdgb6AfcKIToDTwJbpJTtgS2O3wGGAu0dj6nA/Fq3WtHosV6sB6AEQFED1q9fT/v27bn33ntp06YNW7duZdGiRTUu49SaSq8KKWWKlPI3x/Nc4BAQA4wEFjtOWwzc4ng+EvhU2tkFhAohoi/0HcezjlNkKarmP0HRGKlyErhYhYAU1cdqtfL8888zfPhwwsLC+Pbbb9mxYwcDBgzQ2rRa4aKqgIQQrYGewC9AlJQyxfHSGSDK8TwGSHJ72ynHsRS3YwghpmL3ECAa0kxpxARXvKNToXBHVQEp6ppTp05x5513sn37diZPnsy8efO8/o6/LFW+KoQQQcBK4CEpZY77a1JKCVw4GFsGKeUCKWUfKWUfgFRT6sW8XdHIce0DsFR1H4ASAEXVWbNmDd27dyc+Pp7FixezcOHCBrf4QxUFQAjhg33x/0xKucpx+KwztOP4ec5xPBmIdXt7S8exC5KarwRAUXUsVfYA1EYwRdVJS0vjrrvuYtSoUbRu3ZrffvuNiRMn1rjlgqdSlSogAXwMHJJSvu320lfAJMfzScBat+MTHdVA/YBst1BRhSgPQHEx2BwC4NwPUBFFxVaEAB99w7yAFbXH9u3bufTSS1myZAlPPvkkP/30Ex06dNDarDpFSHlhF1oIcTXwA/An4Lzans
aeB1gOxAEngDFSygyHYPwXGAKYgLuklPEX/I4WQs4Z0JYHklte6DSFwkVanpkj53IJCzDSsXmTCs87kWHibHYhfduElzq+d/LvAPT8pEed2qnwfCT2Hb0JCQn4+/nRpUsXAgMDtTarSojt2/c4w+jVodIksJRyJxX3LB1YzvkSuPdiDUk1Fl/sWxSNGvuNS2Xd1KWUDdZ9V9ScrOxsEhISyM/PJzQkhC5du+JjaDwdcjziX2rQGUgdPRSGva+1KQovYfueUzyy4g+ubBvB5//qV+F576/cx9a/zvHrM4NKv7D3evvPbdvqzEaF52K1Wpk5cyYvv/wyrVq14uWXX+basWO9bzdvQ2gHbdAbVA5AcVFUvQzUplpBK0pRXFzMpEmTWLp0KZMmTeK9996jSZOKw4gNGY+4Mgw6g6oCUlwUzp3AVWkFoSqAFE4KCgoYNWoUS5cu5dVXX+WTTz5ptIs/eIoHoDOQZkrT2gyFF+EqA61CN1C1B0ABkJuby4gRI9i+fTvz589n+vTpWpukOR4hAD46HxUCUlwUtosJASkBaPQkJSUxbNgwDhw4wJIlS7j99tu1Nskj8AgBMOgMnDGdwWqzotcpd11ROVXfCKZCQI2d3bt3M2LECEwmExs2bODGG2/U2iSPwSNujQw6AxJJRkGG1qYovIQSD6DyVhAqCdx4+fLLL7nuuuvw8/Pjp59+Uot/GTziyjDo7I6IygMoqorFNRP4fA9gf3I2j674A5tNqhxAI0VKyauvvso///lPevTowS+//EKXLl20Nsvj8IgrwykAKg+gqCpW50jIcgTgyz2n+HLPKdLzzSoE1AgpKirirrvu4umnn+b2229n69atNGvWTGuzPBKPEAAfnQ+gGsIpqo5z3S+vCuj3pCwAcguLVRK4kZGSksLgwYNZvHgxM2fOZMmSJfj5+WltlsfiGUlgvfIAFBdHiQdQOgdQZLFy8LS9W3lekUXlABoRa9eu5Z577iE/P5/PP/+c8ePHa22Sx+MRV4YrBKQ8AEUVKdkIZsO9oeFfKbmuvEBeoYWiYhUCagwsWLDA1cJ57969avGvIh4hAAJBsG+wSgIrqowzCVz2+R+nslzPcwotKgTUCJg9ezbTpk3jpptu4ocffvDqIe31jcdcGZEBkSoEpKgyVrfQj3si+PekLPQ6e4Os7AIzFpvET80DbpBIKXnppZeYMWMGt912G6tWrWqQU7vqEs8RgMBIzuWfq/xEhYKSEBBAsdtYyN+TsugZGwrYZwaAGgfZEJFS8vTTT/Pss88yYcIEli5ditFo1Nosr8NjroyWwS05lXNKazMUXoLVLezjjPnnFBZzLDWfK9s1BSBdCUCDRErJww8/zGuvvca0adP45JNPMDSiHv61SVVGQi4UQpwTQux3O7ZMCPG745EohPjdcby1EKLA7bUqN/hvFdKKE9knqGxCmUIBpQXAORbybHYhAO2aBeHnoyMtrwgAXxUCalA899xzzJkzhwcffJD58+d7Xw9/D6IqsvkJ9hGPnzoPSCnHOp8LIWYB2W7nH5VSXvScvdahrSm0FHIu/xxRQVEX+3ZFI8NdAJwhoEyTfapcWIAPQb4+pOc7BEB5AA2Gd955h5deeol77rmH2bNnq2lvNaTSK0NKuQMot0mPY/7vGGBpTQ1pFdIKgBPZJ2r6UYpGQHkhoEyTPeQTFmAk2M9AWq4zBKQ8gIbAp59+ysMPP8zo0aP54IMP1OJfC9T01uga4KyUMsHtWBshxF4hxHYhxDUVvVEIMVUIES+EiE9NTaVVqF0AErMSa2iSojFQygNwCEC2wwMIDfAhyM+gPIAGxPLly7n77rsZNGgQn332GXq9EvXaoKZXxnhK3/2nAHFSyp7ADOBzIURweW+UUi6QUvaRUvaJjIws8QCylAegqBxLOQLg9ABCA4w08TOQke/wANROYK9m8eLFjB8/niuvvJLVq1fj6+urtUkNhmpfGUIIA3ArsMx5TEpZJKVMdzzfAx
wFOlTl80L8Qgj1C1UegKJKlCoDdQlAMT56QaBRT5CvAadGqBCQ97Jw4ULuuusuBgwYwNdff01QUJDWJjUoanJrNAj4S0rpqt0UQkQKIfSO55cA7YFjVf3A1qGtVQ5AUSXcN4KZHUngLJOZ0AAjQgiCfH1cr6sQkHeycOFCpkyZwg033MC6desIDAzU2qQGR1XKQJcCPwMdhRCnhBD3OF4ax/nJ32uBfY6y0C+B6VLKKk95aRXSSnkAiipRvgdgJizAvvA38SspcFMhIO/DufgPHjyYNWvWqB2+dUSlZaBSynK7KkkpJ5dzbCWwsrrGtAppxZbjW5BSqgy/4oKUlwTONBUTGmDfDVpKAFQIyKv4+OOPmTJlCjfeeCNr1qxR7ZzrEI+6NWod2po8cx6ZhZlam6LwcCw26QrtOAUgqyIPQIWAvAKbzcbcuXOZMmUKQ4YMUYt/PeBRV4YqBVVUFZtbkzeztWQjWJjDA1A5AO/im2++oVOnTtx3330MGTKE1atXq8W/HvCoK6N1aGtAlYIqKsdis+HvEIBii30mQJbJTEi5OQAVAvJkfv31V0aNGoWPjw9ffPEF69atU4t/PeFRHZTahLYB4HD6YY0tUXg6Nhv4+ZSEgExmK8VWWeIBqBCQV3DixAlGjBhBdHQ027ZtIzIyUmuTGhUeJQBh/mF0iOjAT0k/aW2KwsOx2GyuEFCx1ebWBsLuAQQ7BEAnwKBTBQWeSE5ODsOHD6ewsJCtW7eqxV8DPO7W6OrYq/kx6Uds8vxh3wqFE6sEf2NJDiDL1QaidA7A16BXFWUeSFFREWPHjuXgwYN8+eWXdO7cWWuTGiUeJwBXxV1FRkEGh9NUGEhRMVabDT9DeR5A6RCQ2gPgeRQWFjJ69Gi++eYb3n//fQYNGqS1SY0Wj7s6ro67GoCdJ3dqbInCk7FYpcsDKLbYSrWChpIksIr/exYWi4UxY8awYcMGPvjgA6ZMmaK1SY0aj7s62oe3JzIgkh+TftTaFIUHY5MSo74kCZzl1ggOINDoFABVAeQpSCl58MEHWbduHXPnzmXq1Klam9To8TgBEEJwVdxVSgAUF8Rik+j1AqNeh9kqycy3ewAh/nYPQK8TBPkalAfgQcyaNYt58+bx2GOP8Z///EdrcxR4oAAAXBV7FUcyjpBmStPaFIWHYrNJ9ELgoxeuHECQrwGj24If5GtQOQAPYcWKFTz22GOMGTOG1157TWtzFA488uroGNERgGOZVW4kqmhkWGwSg07gY9BhsdrILigmNMCn1DlN/AwqBOQB/Pjjj0yYMIGrrrqKxYsXqxm+HoRH/p9w7ghWLSEUFWG1SfQ6gY8zBGQyuyqAnDQN8nWFhBTakJCQwMiRI4mLi2Pt2rVqh6+H4VEbwZw4ewKplhCKinAKgFGvs4eA8s3neQBv3NYNndoEphmpqakMHToUIQQbN24kIiJCa5MUZfBIAQj2DSbcP1x5AIoKKfEA7DmAtDwzbZuVnhYVGx6gkXWKgoICRo4cSXJyMlu3bqVdu3Zam6Qoh6oMhFkohDgnhNjvduwFIUSyEOJ3x+Mmt9eeEkIcEUIcFkLcWF3DWoW0IjE7sbpvVzRwrNItBGSxkZpbRGQTNSvWE7DZbEyYMIFdu3axZMkS+vfvr7VJigqoSg7gE2BIOcdnSyl7OB4bAYQQnbFPCuvieM8854jIi6V1aGvlASgqxGotEYD0PDNmq43IICUAnsDjjz/OypUreeuttxg9erTW5iguQKUCIKXcAVR1rONI4AvHcPjjwBGgb3UMcwqAdBv9p1A4scqSKqDkrAIA5QF4AHPnzmXWrFncd999PPzww1qbo6iEmlQB3SeE2OcIEYU5jsUASW7nnHIcu2hah7bGVGwivSC9BiYqGioWm0SnExj1gjM5hYASAK1Zt24dDzzwACNGjOCdd95RTfi8gOoKwHygLdADSAFmXewHCCGmCiHihRDxqamp572uSkEVF8Lq3Aeg17nmAzdTAq
AZ8fHxjBs3jl69evH555+j16v9F95AtQRASnlWSmmVUtqADykJ8yQDsW6ntnQcK+8zFkgp+0gp+5TXB7xViBoPqSgfKaW9CkjYBcBJZJCqMdeCxMREhg0bRrNmzVi/fj2BgYFam6SoItUSACFEtNuvowBnhdBXwDghhK8Qog3QHvi1Ot+h5gMrKsJxw49ep3MJgFGvI9jfI6uaGzQZGRkMGTKEoqIiNm7cSFRUlNYmKS6CSq8YIcRS4HqgqRDiFPA8cL0QogcggURgGoCU8oAQYjlwELAA90oprdUxLNQvlFC/UCUAivNwhnz0OjAa7HHmyCa+KuZczxQWFjJy5EiOHz/O5s2bufTSS7U2SXGRVCoAUsrx5Rz++ALnvwy8XBOjnKhSUEV5lAhAiQfQVMX/6xWbzcbEiRPZuXMnX3zxBddee63WJimqgUf2AnLSIaIDB1IPaG2GwsOwOkqDDTqBwdFYTCWA65fHHnuMFStW8NZbbzF27FitzVFUE48WgD7RfUjMSiQ1//wqIUXjxWq1C4BOJ0qFgBT1w5w5c3j77be5//77mTFjhtbmKGqARwvA5TGXAxB/Ol5jSxSehMVmA3CVgQJqF3A9sXLlSh5++GFGjRrF7NmzVd7Fy/FoAegd3RuBYPfp3VqbovAgnCEgnbsAKA+gzvnxxx+544476NevH5999pmq9W8AeLQANPFtQqemnZQAKErhTAIblADUGwkJCYwYMYK4uDi++uor/P39tTZJUQt4tACAPQy0O3m36gmkcFFSBWRvBQFKAOqSjIwMbr75ZoQQfP311zRt2lRrkxS1hOcLQIvLOZt/llM5p7Q2ReEhuARAqBxAXWM2mxk9ejQnTpxgzZo1tG3bVmuTFLWIVwgAoMJAChcWZwhIL4gJ8ycswIdmwUoAahspJdOnT2fbtm0sXLiQq6++WmuTFLWMxwtA9+bdMegM7E5WAqCwY3MIgE4IbukRw89PDVTD3+uAN954g0WLFvHss89yxx13aG2Oog7weAHwM/jRLaqb8gBqiWKrjZ+PeneLbYtbElinE/j5qMW/tlm1ahVPPvkk48aNY+bMmVqbo6gjPF4AwB4Gij8dj03atDbF6/nu4FnGf7jLNUTFG3HmANTA97ohPj6eO++8k379+rFo0SJV69+A8RoByC7K5kjGEa1N8XryiiwAmBw/vRH3MlBF7ZKUlMTw4cOJiopizZo1+PmpFtsNGe8QAMeOYJUHqDk2RzmtM4zijTg3gumVANQqubm5DB8+HJPJxPr161Vr50aAVwhA58jO+Bv8VR6gFnAu/BarZwqAlJJNB85QbK043Oe+D0BRO1gsFsaNG8f+/ftZsWIFXbp00dokRT3gFQJg0BnoFd1LCUAt4KygcfbT8TSOnMtj2v/28PX+MxWe4xQvJQC1x4wZM9i4cSNz585l8ODBWpujqCe8QgDAngfYm7IXi817Y9eegMsD8NAQUK4jN3E8Nb/Cc5xhLL1KTtYK7733Hu+99x6PPPII06ZN09ocRT1SqQAIIRYKIc4JIfa7HXtTCPGXEGKfEGK1ECLUcby1EKJACPG74/F+bRnaM7onBZYCjmYcra2PbJRYPTwEVFhsHyCXmF6xALhvBFPUjA0bNvDQQw8xcuRIXn/9da3NUdQzVfEAPgGGlDm2GegqpewG/A085fbaUSllD8djeu2YaZ8OBpCUk1RbH9kosXp4CKjIYrfrQgLgvhFMUX1+//13xo4dS8+ePVV3z0ZKpQIgpdwBZJQ59q2U0hmL2QW0rAPbShEbHAvAyeyTdf1VDRqrh1cBFRXbBeBEuqnCc0o2gnlNBNPjSE5OZtiwYYSFhfHVV18RGBiotUkKDaiNK+hu4Gu339sIIfYKIbYLIa6p6E1CiKlCiHghRHxqauUTv2KCYxAIJQA1xDlNy1NDQEUWewgoI99MdkFxuedYHd6LSgJXj7y8PIYPH052djYbNmygRYsWWpuk0IgaCYAQ4hnAAnzmOJ
QCxEkpewIzgM+FEMHlvVdKuUBK2UdK2ScyMrLS7zLqjUQ3iVYCUEOcHoDVw0NAACcr8AKcFaJKAC4eq9XKHXfcwR9//MGyZcvo1q2b1iYpNKTaAiCEmAwMA+6Qjmb9UsoiKWW64/ke4CjQoRbsBCAuJE4JQA1x5gCKPdUDcCSBoeI8gEV5ANXmscce46uvvmLOnDncdNNNWpuj0JhqCYAQYgjwODBCSmlyOx4phNA7nl8CtAeO1YahYBcAlQSuGU4BsNZzDsBqk3z6cyJmy4U9D3cPIDGtfAGwqZ3A1WL+/PnMnj2bBx54gPvuu09rcxQeQFXKQJcCPwMdhRCnhBD3AP8FmgCby5R7XgvsE0L8DnwJTJdSZpT7wdUgLtjuAajpYNWnxAOo3xDQbyczeW7tAX4+duFOpE4BCAvwIbGCEJAzf6F6AVWdb775hvvvv59hw4bx9ttva22OwkMwVHaClHJ8OYc/ruDclcDKmhpVEXEhcRRaCkkzpREZWHneQHE+WnkAuYX2hG6B2XrB84qKrQgB7aOacKKCEJDyAC6O/fv3M2bMGC677DKWLl2qyj0VLryqji42RJWC1hStdgLnFtqrhs2VeB5FFhu+Bh1tIgIr9gBUL6Aqc+bMGW6++WaaNGnCunXrCAoK0tokhQfhVQIQFxIHKAGoCa5uoPUcAsovst/5uyd5y8MuAHqah/iRlldUrqeimsFVjYKCAm655RbS0tL46quvaNmyzrfrKLyMSkNAnoQSgJqjlQeQX1Q1D6Cw2IqvQUeA0R6mKCi2EuRb+s/UfSi8onxsNhuTJk3i119/ZdWqVfTu3VtrkxQeiFd5ABH+Efgb/JUA1ACbRgLgHERTlSogXx83ASgnZ+ASANULqEKee+45VqxYwRtvvMEtt9yitTkKD8WrBEAIYd8LkKMEoLpYNEoCOz2AokoFwIqvQe+a81tYTshIeQAXZvHixbz88stMmTKFRx55RGtzFB6MVwkAQKvQVmo0ZA2waVQGmm+uogdQbE8C+7uFgMqiksAVs2PHDv71r3/xj3/8g3nz5ql5vooL4nUCcFXsVfxx5g/STGlam+KVaOUB5DmTwJbKk8B+PnpXCMh0gRCQ2gdQmoSEBEaNGkXbtm358ssv8fHx0dokhYfjdQJwY9sbkUg2H92stSleibMXUH23gsivcg7AngR2hoAumANQAuAiIyODYcOGIYRg/fr1hIWFaW2SwgvwOgHo06IP4f7hbDq6SWtTvBJnN9D6bgaXV8UcQKEzBFRJDkAnUOENB2azmdGjR5OYmMiaNWto27at1iYpvASvKgMF0Ov0DLpkEJuObkJKqRaBi0QrDyCv8GI8AD0BRvufZrkhICnV3b8DKSX//ve/2bZtG0uWLOHqq6/W2iSFF+F1HgDAkLZDOJN3hn1n92ltitehVSsIZxK48iogexmo0wMoLwlstSkBcPLGG2+wcOFCnnvuOe644w6tzVF4GV4pAIPbDgZgQ8IGjS3xPrQaCVnlHECxDT+DHj+j/U+zIgFQ08Bg5cqVPPnkk4wfP54XXnhBa3MUXohXXkUxwTH0b9mfz//8XHUGvUi0GgpfkgOorArI6tgIZg8BFZgt553jzAE0Znbv3s2ECRPo378/CxcuVKFQRbXwSgEAuLPbnRxIPaDCQBeJVYOdwBarjULHrF9nCOijH47x/eFz553rTAL7GRwegPl8j8Fis2HQe+2fbo05efIkI0aMICoqijVr1uDn56e1SQovxWuvojFdxmDQGfjsz88qP1nhQgsByHdL5DoFYMGOYyz9pfSObimlKwls0Osw6nUVhIBA10jveHNzcxk+fDgmk4kNGzbQrFkzrU1SeDFeKwBNA5oypN0QPv/zc2zSM+fbeiJWDbqBOuP/UJIDKCi2kpJdWOo8i01ik+DruPv389FVUAZqa5SbwKxWK+PHj+fAgQOsWLGCzp07a22SwsupkgAIIRYKIc4JIfa7HQsXQmwWQiQ4foY5jg
shxLtCiCNCiH1CiF51ZfyEbhNIzk1m0xG1J6CqaNEN1F0AnB5AgdlKSnZBqfOcr/n62P8sA4wGTG45APtUsf1YGmkV0COPPMKGDRv473//y+DBg7U2R9EAqKoH8AkwpMyxJ4EtUsr2wBbH7wBDsc8Cbg9MBebX3MzyuaXTLUQFRjF399y6+ooGh02DMtBchwAEGvWYLVaKrTYsNklanrlUUtg5K8C5C9jfqKeguMRT+XLPKT79+QTncooanQDMnTuXOXPm8PDDDzN9+nStzVE0EKokAFLKHUDZ2b4jgcWO54uBW9yOfyrt7AJChRDRtWFsWYx6I1N7T2VjwkaOZdba7PkGjUWDZnBODyA8yIjZaisV1z/jFgZyeQCuEJC+VCuIg6dzAEhMz29UIaBvvvmGBx54gOHDh/Pmm29qbY6iAVGTHECUlDLF8fwMEOV4HgMkuZ13ynGsFEKIqUKIeCFEfGpqarWNmNZ7GjqhY/7uOnM0GhRaeAAuAQj0pajYVmpRP51VIgDOeL+vwe4BBBj1FBTb32u1Sf46YxeA5KwCdI1EAJzzfLt168bnn3+u5vkqapVaSQJLezH+Ra0oUsoFUso+Uso+kZHVH/AeExzD8I7DWfLnEpUMrgLODWD1uQ/A2Qk0PMDH7gG4CYB7HqCsB+Dv5gEcT8t3lZJK2Tg6gZ49e5Zhw4YRFBSk5vkq6oSaCMBZZ2jH8dNZ1J0MxLqd19JxrM4Y1WkUZ/LOsDdlb11+TYPAeeNfnzuBz/MAit0FoJwQkI9bCMix6B84nQ2As/qzoZeBFhQUMHLkSFJTU1m3bp2a56uoE2oiAF8BkxzPJwFr3Y5PdFQD9QOy3UJFdcLQdkMRCNUaogq4PIB6DAHluQTA57wcwOksNw+gnBCQMyx0MCUHo17HZTEhABga8DhIm83G5MmT+fXXX/nss8/UPF9FnVHVMtClwM9ARyHEKSHEPcBrwA1CiARgkON3gI3AMeAI8CHwn1q3ugyRgZH0jemrBKAKOG/86zMElF9kwaATNPHzwWqTrs6gUH4S2M+nJATkLAM9eDqH9lFBtGkaCDTsWQDPP/88y5cv5/XXX1fzfBV1SpXaQUspx1fw0sByzpXAvTUxqjoM6zCMZ79/lrN5Z4kKirYe++UAACAASURBVKr8DY2UEg+gfkNAQX4GV2w/q6AYgMgmvpwutwrIrQzUbEVKycHTOQy8tBlRwfa2Bw11HvCnn37KSy+9xJQpU3j00Ue1NkfRwPHancBlubn9zQBsTNiosSWejbP6sz5DQLlFFgKNBowOAcg2mQFoGxlYKglcUgXk8ACMegqLbaTmFpGeb+bS6GBiwwKAhukB7NixgylTpqh5vop6o8EIQI/mPYgLiWPloZVam+LRWGuhCmjlnlM8t3Z/5Sc6yC+yEORrcN3ZZ5nsHsAlkUFkmYpdlT7neQA+esxWG4npJgBaNw0kNrxhCsCRI0cYNWoUl1xyiZrnq6g3GowACCEY03kM3x79lsyCTK3N8VhqYyDM9r9TWbO36oVd+UVWAn31Lg/AGQK6xBHPP+3wApy7gn3dcgBg3/gFENXEj9hwf6BhCUBGRgY333wzQgg2bNig5vkq6o0GIwAAY7uOpdhWzOq/VmttisdirYWdwCazhZxCS5UbyuUVWQj0LckBZDsEoG0ze117cqZDAIrL7AMw2gXghEMAmgX7Eh3ij0EnGowAmM1mbrvtNjXPV6EJDUoAekf35pKwS1h2YJnWpngszm6gNfEAnGWdOYXnD2spD2cIyOUBOEJAPWND0esEuxPtXUZKqoBKQkAAiekmDDpBeIARvU4QE+bfICaCOef5fv/993z88cdqnq+i3vH+q8gNIQRju4xly7EtnMk7o7U5HknteAD2UE2mI5lbGfnneQBm/Hx0hAYY6Rkbyva/7a1AnCEgo/58DyCyia+r/cMLw7vw7+svqbb9nsKbb77pmud75513am2OohHSoAQAYHKPydikjdk/z9
baFI+kNnIAzp29WVUVALOVQKO+lAfgHPl4XYdI9p3KJi2viMJiG0a9zrXQuwQgzUSz4JKpVwM6NaN3q/Bq2+8JrFq1iieeeIJx48apeb4KzWhwAtAhogPjuo5j7u65pJnStDbHo5BSurWCqIkAODyA/OIqnV9gthJQJgfgDO9c19HeB+qHhFTHNLCSP0nnOblFFpo18a22vZ5GfHw8d955J/3792fRokWq3FOhGQ1OAACeueYZTMUm3tn1jtameBTud/01EgDH7tyqhICKrTbMVhsBPvqSMtCCYtdu364tQogINLL9cCpFFpurAghKBAAgKrhhCEBSUhLDhw9X83wVHkGDFIAuzbowuvNo/vvrf8ktytXaHI/Buegb9TqsNomUFy8CUkq3EFDlHoAzXxDglgQ2W2yuEJBOJ7i2QyQ7EtIoNFtdIgH2XkBOopp4/0KZm5vLsGHDMJlMrF+/Xs3zVWhOgxQAgMevfJzsomw++u0jrU3xGGyOBd8ZZqmOF1BksbnCSFkFlXsAzl4+AUZ9ueEdgP5tI8jIN3PgdE4pD8DP7ZxmXu4BlJ3n26VLF61NUigargBcHnM517a6lnd+eYdia9Vi1Q0d54LvXGSrkwh2n++beTEegFsSGMDP7e7+ijb2hO7hs7mlPAB/o7sAeLcH4Jzn+95776l5vgqPocEKAMBjVz7GyeyTrDi4QmtTPALnNDDnIludUlBnAhiqVgVU4BIAQ+nwjtvdfVx4gCvG7+4luIeAvDkJPG/ePNc833//+99am6NQuGjQAnBT+5toF96OD/Z8oLUpHoHLAzDUwAMwu3kAVagCcnoMZT0A97t7IQR920SUsg3Az+CeBPZOD0DN81V4Mg1aAHRCx5SeU9hxYgd/p/+ttTma4/QAnAtxcTUawjkXdD8fXZWqgEzFJSGgUou7T+nZtn0dYSBft+M6ncDXoHPtAvY2nPN8L7vsMjXPV+GRNGgBAJjUYxIGnYGPf/tYa1M0pyQHYF+IqucB2Bf0mFD/qlUBFZWEgAw64Rrp6B7egZI8gLtIgN1TcN8F7C2oeb4Kb6DaAiCE6CiE+N3tkSOEeEgI8YIQItnt+E21afDF0jyoOcM7DOeTPz7BbK3aztWGirVMCKg6OQCTwwOICQsg02TmUEoOlz2/icS0/PLPd6sCEkKUGvjuTrvIICICjQT5lp5RFOCj97r4v5rnq/AWqi0AUsrDUsoeUsoeQG/ABDjbcM52vial1HxCy7Te0ziXf67RewFlBaA6HoCzEVzLMH+KLDa+PXCW3CIL+x1D28tS4BYCgvP7/DjR6QQfT76chwa1L3U82N+HFqH+F22nVthsNiZNmqTm+Sq8giqNhKwCA4GjUsoTnritfXDbwVzX6jqe2/Yc4y8bT6hfqNYmaYJVlq4Cqs5YSJNbCAhg29/ngJKWzmXJdwsBgSP8VGg5zwMA6BF7/v+Xt/7ZnWA/7xmO8uijj7JixQreeustNc9X4fHUVg5gHLDU7ff7hBD7hBALhRDlTrcQQkwVQsQLIeJTU1NryYzyEULw9o1vk25K5+UdL9fpd3ky1jL7AKqzEcxZBdQyzC4AvydlAZCcVb4AFJgtCFEy6L0iD6AiusaEEBcRcNF2asHs2bOZPXs2Dz74IDNmzNDaHIWiUmosAEIIIzACcBbbzwfaAj2AFGBWee+TUi6QUvaRUvaJjIysqRmV0iu6FxO7T+TdX98lKTupzr/PEykbAqrOWMj8IgsGnaCZozWDs5tEhR6A2UqAj97V8KzstK+GwvLly5kxYwajR49m1qxZqsGbwiuoDQ9gKPCblPIsgJTyrJTSKqW0AR8CfWvhO2qFmdfPRErJizte1NoUTbDayoaAqiMAVgKMesICS8IyEYHGCj0Ak9mKv7Ek0nixHoA3sHPnTiZOnMjVV1/NkiVLVLmnwmuoDQEYj1v4RwgR7fbaKKDq08PrmFahrZjeZzoL9y4kIT1Ba3PqnfM9gOrsBLYPdwlz1O
XrdYJBl0ZV6AGYzBYCfUsWRN8y0768nYSEBEaOHElcXJzq7qnwOmokAEKIQOAGYJXb4TeEEH8KIfYBA4CHa/Idtc3T1zyNn8GPezfei01WfyqWN1J2J3B1PACT2Uqgr4HQALsH0DGqCW2bBZJbZHHN+i17vvti79uAPID09HRuuukmdDodGzduJCIiQmuTFIqLokYCIKXMl1JGSCmz3Y5NkFJeJqXsJqUcIaVMqbmZtUfzoOa8ecObbD62mbm/ztXanHqlbDfQ6raCCDTae/uHBfjQu1UYMaH2JG15XkCBQzCcNJQcQGFhIbfccgtJSUmsXbuWdu3aaW2SQnHRNPidwOUxvc90bmp/E49/93ijahHhTPo6wzDVawZncS3oS6f249HBHYlxVASVlwfIN1tK7fptCDkAm83G3Xffzc6dO/n000+58sortTZJoagWjVIAhBB8NPwjfPW+/GfDf6o1GMUbqRUPoMjqqunv1DyYkAAf156A5EzTeecXlA0BNQAP4LnnnmPp0qW8+uqrjBkzRmtzFIpq0ygFACC6STQv/+Nlthzfwhf7v9DanHqhbA6gOs3gyiZ1AZoGGfE16Cr0ANxDQE4PoGwvIG9h4cKFvPzyy0yZMoUnnnhCa3MUihrRaAUA7KGg3tG9mfHtDLILy29l0JAoOw+geq0gSsf0we5RxYT6lysABWZrqXCP87vLdgP1Br777jumTZvG4MGDmTdvnqr1V3g9jVoA9Do97w97n7N5Z3n2+2e1NqfOKTsRrHqtIOxJ4LLEhPmXmwTOL7KWOt9o0CHE+V0/PZ3ffvuNUaNG0alTJ5YvX46Pj/e0p1AoKsK7rsI6oE+LPvy7z7+Zu3suv6X8prU5dUpNdwLbbNJVBlqWlmEBHE/LL+VV2GySguLSG8EigoxEBvl61d1zQkICQ4YMISIigm+++YaQkBCtTVIoaoVGLwAALw98maYBTXnwmwcbdEL4/J3AF+cBOIe7BBrPF4Ar2oSTU2hhf3JJKK3QUroTKMDUay9h9b1XXZzhGnL69GkGDx6MlJJNmzYRExOjtUkKRa2hBAAI9Qvl/67/P3ae3Mnqv1ZX/gYvxSprthHMOQsgwPf8ENA17ZsiBGz/u6Sxn7MTqHsIKMBocFUNeTq5ubkMHTqU1NRUNm7cSMeOHbU2SaGoVZQAOLin1z10juzME9890WAHx1gdd/yuHMBFhoCcswDKDm0BiAjypVtMCNsOn3Mdcw6E9y/HY/B0bDYbEydO5MCBA6xatYrLL79ca5MUilpHCYADg87Amze8yZGMI3y27zOtzakTnPu+qtsMzjkLIKCCBf26DpH8npRFlmNWsLN1dHlJY0/n//7v/1izZg2zZs1i8ODBWpujUNQJSgDcGNpuKN2juvPWz281yFyAywOoZjM4pwdQdh+Ak+s6NsMmYeeRNKBEMLxt1++qVauYOXMmkydP5oEHHtDaHIWizlAC4IYQgkevfJSDqQf5+sjXWptT69TUA8hxNHtr4lt+CWSP2FBCA3zYcsgeBiqoxGPwRP78808mTpzIFVdcwfz5872qWkmhuFiUAJRhbJexxDSJ4bWdrzU4L6CmOYBTjjr/FqHltzzW6wSDO0ex+eBZCoutrhCQt+z6TU9PZ+TIkQQHB7Nq1SrV2lnR4FECUAYfvQ/PXPMMP5z8gQV7FmhtTq3iLAP10ds3Y1kvsgw0KdNEoFFPeKCxwnNu7taCvCILO/5OdfMAPF8ALBYLY8eOJTk5mdWrV9OiRQutTVIo6hwlAOUwrc80Bl0yiBnfzuBIxhGtzak1nCEfvRAYdILiiwwBJWUUEBsecMGwyJVtIwgL8GH9vpSSJHA5VUOexmOPPcaWLVtYsGABV1xxhdbmKBT1Qm3MBE50DID5XQgR7zgWLoTYLIRIcPwsdzC8p6ITOhaNXIRRb2T40uGkmdK0NqlWcHYD1esFBp3uonsBJWWYaBl24QHtPnodQ7pG892hs2Tk2auBPD0J/M
knn/DOO+/w0EMPMWnSJK3NUSjqjdryAAZIKXtIKfs4fn8S2CKlbA9scfzuVbQMbsmasWtIzEpk6GdDyTfna21SjXEmgZ0ewMXkAKSUJGWaiAu/sAAA3NKjBSazlUU/JQIQ4MGN33755RemTZvGwIEDefPNN7U2R6GoV+oqBDQSWOx4vhi4pY6+p065rvV1LLttGfGn43lt52tam1NjnDF/vU5g0AssNhv7TmWRW3j+KMeypOebMZmtxIZXvov3iksiuOfqNmTkmzEadBj0nhlpNJvNjBo1ipYtW7Js2TIMBs8PVSkUtUltXJkS+FYIsUcIMdVxLMptFOQZIKrsm4QQU4UQ8UKI+NTU1LIvewwjOo5gfNfxvPXzW5zMPqm1OTXC5QHoBHqdjrwiC7fN/5lFPyZW+t6kDPuwl9hKQkBOnr7pUm7oHEVLD237YLPZOHBgPzk5Oaxdu1bN81U0SmpDAK6WUvYChgL3CiGudX9R2mspz4s1SCkXSCn7SCn7REZG1oIZdcdrg+x3/09+53WRrFI4PQCdAB+9IDEtH7PVRmJa5eGtJEcJaFxE1QRArxN8cGdvNjxwTfUNriOklCQk/E1OTi7/+9//6Nq1q9YmKRSaUGMBkFImO36eA1YDfYGzQohoAMfPcxV/gucTFxLHjH4zWLp/KXtT9mptTrWxSoleJxBCoNcJjjsW/lOOQS7JWQX8dSan3Pc6PYCWYVW/o9fphEcmgN99913OnDlL69atGDVqlNbmKBSaUSMBEEIECiGaOJ8Dg4H9wFeAs5xiErC2Jt/jCTx21WOE+oXy3LbntDal2lhsdgEAe7VOpske+3cOcnlx3UHu/7x8gUvKMNE0yOhVu3rL47vvvuORRx6hadMIWrVqrbU5CoWm1NQDiAJ2CiH+AH4FNkgpvwFeA24QQiQAgxy/ezWhfqE8fuXjrP97PT8n/ay1OVWisNhaqjunzSbRO2r4nUIAcCanEIvVxt9nc0nLKyr3s05mmIitQgWQJ3P06FHGjBlDp06d6NTpUq3NUSg0p0YCIKU8JqXs7nh0kVK+7DieLqUcKKVsL6UcJKXMqB1zteX+K+4nKjCKu9beRWZBptbmVMrGP1OYvGg3px0hHotNYnAs/AY3AbDaJMlZBZzMMJFdUOyaHez++ol0U5UTwJ5Ibm4uI0eOBGDt2rXo9Z4XmlIo6hvPrM/zUIKMQSz/53KOZR7j1uW3UmQp/27ZU8hyhHiyHU3cbDaJzikAevtPf0eN/s9H07HYJDYJuY6unwBHzuUxat6PJGcV0CsutD7NrzVsNhuTJk3i0KFDLF++nLZt22ptkkLhESgBuEiubXUtH4/4mG2J2xjy2RCP9gRMjlYM+Y4FvbQHYP9f36e1fZP2Dwklu52zTSX7AmZv/pvjqfnMGdeDSVe2rg+za50XX3yR1atXM2vWLAYNGqS1OQqFx6AEoBpM6D6BJaOW8OPJH7l+8fUUWgq1NqlcnP34nX38bdLNA3D87HeJvf7d2cMfINNUMhHtaGoefduEM7JHjFe2Rl69ejUvvPACkyZN4sEHH9TaHIXCo1ACUE3u6HYHK8esZN/Zfby4/UWtzSkXpwA4Z/NarG4egCME1L5ZEBGBRleYCCDL8VxKe+y/VURgfZpda/z5559MmDCBvn378v7773ulgCkUdYkSgBowvONwJveYzBs/vcHvZ37X2pzzcIZ+nF05rVKiE6VDQLHhAcQ4avubBvkCuEY6puYWUVBspXVT70v+JiYmcvPNNxMcHMzq1atVb3+FohyUANSQWYNn0TSgKeO+HEd2YbbW5pTCVOz0ABwCYJOuO39nGWhseAAxjnYNziSv0xtITLdv/vI2DyApKYkBAwaQl5fHxo0bVW9/haIClADUkHD/cL4Y/QVHM49y+6rbsdqsWpvkwlRUOglsddsH4KMXhAcaCfI1lAhAK3tC2Fk9lJhu3yncuortHzyB5ORkBgwYQGZmJps3b6ZHjx5am6RQeC
xKAGqB61pfx7tD3mVjwkZe+eEVrc1xke9KAtt/Wt12AndvGcrATs0AXCGgjlFNCDTqSwQgLR+DTrgEwtM5c+YMAwcO5Ny5c2zatInevXtrbZJC4dF4975+D2J6n+nsTNrJC9tfYECbAVwdd7XWJp1XBuouAPcPbO86r2+bcC6JDKRbyxBCA4xkFdhzACfS7bt/PbWdszupqakMHDiQU6dO8c0336ipXgpFFfD8K9tLEEIw/+b5tAltw7gvx3Es85jWJpVUAZnPFwB3urQIYesj1xMR5EuIv49rH0Biej6tvCD84wz7HD9+nPXr13P11dqLr0LhDSgBqEWCfYNZNXYVBZYCrv/keo5mHNXUHlNRmSSwLF8A3AkL9CGroNhVAtrawxPAf/31F1dddRUnT55kw4YNXH/99VqbpFB4DR4jAD8dSeOno94/e7dbVDe2TNxCfnE+Vy28ij2n92hmS74rBHR+DqAiQv2NZJnMpOebySuyeLQHsH79eq644goKCgrYtm0bAwYM0NokhcKr8BgBeP2bv3jt67+0NqNW6NG8Bzvv2omfwY9rP7mWjQkb690GKSUFZXYCu1cBVURIgA/ZBcWccFUAeaYHsGHDBkaMGEG7du3YvXs3vXr10tokhcLr8BgBOJtTxOksz2ypUB0ujbyUn+/5mU5NOzFi6Qg+3PNhvX6/2WrD4ujq6UwGW6rkAfiQZSrmUEouAG0jg+rW0Gpw+PBhbr/9dnr06MGOHTuIi4vT2iSFwivxGAFIzSsiLa+IIovn1NHXlOgm0WyfvJ3BbQczdf1UPv3j0zr7rnM5hby//Sj2CZwl8X8oCQHZqiIAAT5YbJKfj6YTFuBTpSHw9YXVauXDDz/k2muvxWg0snr1agIDPdNDUSi8AY8QAItNYnXcrZ7L8ewWyxdLkDGI1WNXM7DNQO5eezdfHf6qTr5n9d5kXvv6L5Iy7L3/nfF/fx+9KwRUNQ/ACMAPCal0jw31mP45mzZtokePHkydOpV27dqxZcsWWrVqpbVZCoVXU20BEELECiG+F0IcFEIcEEI86Dj+ghAiWQjxu+NxU2WfVWy1uZ47h5c0JHwNvqwau4qe0T0ZtWwUb/74putOvbY46ZjZm5ZvF1Bn/L9ZsC/5RRaklNiqUAUUEuADQE6hhe4tte//v2/fPoYOHcqQIUMwmUysWLGCnTt30q1bN61NUyi8npp4ABbgESllZ6AfcK8QorPjtdlSyh6OR6UZUIu1ZDFMyW44eQB3gn2D2TZpG6MvHc3j3z3ONYuuYXfy7lr7/CTHXN+MPPsmLucu4MggXyw2ac8JuHUDrYhQfx/X8x4aDoDZu3cvQ4YMoXv37uzatYtZs2Zx8OBBbrvtNo/xShQKb6faAiClTJFS/uZ4ngscAmKq81kWdw8g+8IewNa/zvLVH6er8zWaE2gMZNlty/ho+EccyThC34/6MnH1RJJzkmv82accHkC6wwNw9gGKbGLv8JlfZLXPA6hk8QwNMLqea+EBSCmZM2cO/fr1Y+/evbz00kscPXqUGTNm4OvrW+/2KBQNmVrJAQghWgM9gV8ch+4TQuwTQiwUQoRV8J6pQoh4IUR8Vm4eAIFGPSmVVAL9d+sRZm/+uzbM1gQhBPf0uoeE+xN46uqnWH5gOZfOvZR5u+dhk7bKP6AcbDbJKYcHkFbGA2jmEgCLfSKYvvIkMEBceADhgcYLnlvbpKWlMWLECB566CGGDBnCwYMHeeaZZwgPD69XOxSKxkKNBUAIEQSsBB6SUuYA84G2QA8gBZhV3vuklAuklH2klH2Mfv6OipMAUirxAE6km0jOLHAljb2VJr5NeGXgKxy89yD9Wvbj3o33MmnNJCw2S7nnr9xzivX7yvd8zuYWYnZ4URn5dgFwln46PYC8Iot9JnBl+wAcIaDusfV393/u3DmeeeYZOnTowLfffsu7777LmjVriIiIqDcbFIrGSI0EQAjhg33x/0xKuQpASnlWSmmVUtqAD4
G+lX2OxSpp1sSPFqH+F9wLkFNYTHq+GbPVxtmchpEruCTsEjbduYmXBrzEkn1L+OeKf54nAkUWKy+sO8Bbmw6X+xnOyh+A9DxHCMiZAyjrAVSSA/Dz0TPu8ljG9GlZ7X9TVUlPT+epp56iTZs2vPrqqwwYMIDdu3dz//33qzi/QlEP1KQKSAAfA4eklG+7HY92O20UsL+yzyq22mgW7EvzEL8LegAnHQNKAJIyTBWe520IIXjm2meYM2QOa/5aw8j//ZsN+1Jcr//wdxq5hRYS003lCp/zv0VogA/pDg/A2f+nWRP7JKx8sxWrrWQm8IV4bXQ3rmkfWeN/V0UUFhby4osv0qZNG15//XVGjhzJoUOHWLlyparuUSjqkZq0g74KmAD8KYRwzkN8GhgvhOgBSCARmFbZB1lsDg8gxI9MUzEFZiv+Rv155x1Py3c9T8osoKE1/H3gigf4Oz2Bubv/S+I5I4M6v42vwZcNf6ag1wmsNsmvxzMY3r30hKuTGSaEgMtiQkh35AAKyvEArFXwAOoSk8nEmjVreOGFF0hISODWW29l5syZdO3aVTObFIrGTE2qgHZKKYWUspt7yaeUcoKU8jLH8RFSypTKPstitREV7Et0iH3XqdMLWPXbKT7/5aTrPGd/GiFK6t4bGnd2+n/4WXtz0DSPNnPasO7wRjYfPMstPWIIMOr59XjGee9JyjTRPNiP6BA/VxVQvtmKUa9zxfTziixV6gZaF0gpWbRoES1atOCOO+5ACMG3337LypUr1eKvUGiIR+wEltirVaJD7eGKM469APO3HeWNTX+5Er6J6Sa7UAT7ucoeGxo7E7JoZn6BZkUvEuYXwahlIzlr2cItPVvQu1VYuQJwKqOA2LAAIoJ8ycg3I6XEZLYQ4Ksn0Nfu5Dk9gPoWgMOHD3Prrbdy991306NHD77//nsOHTrEDTfcUK92KBSK8/EIAQCICvajhcMDSM4qoMBs5WhqHlmmYvadygLsHkCriEBahgc0WA9g+9+p+Oh0+Nt6smDoemIDe5Lm8xYnC77jijbhHD6bS6Yjzu/kZIZ9cldEoJFiqySn0EJ+kZVAo4FAX3sozeTIAVTWDbQ2SE5O5oMPPmDYsGF07tyZzZs389prr7Flyxauv/56dDqP+bNTKBo1HnMlNgv2IybMHz8fHQdO53D4bC7OSs/tf6cCdg+gdUQAceEBJGU2PAFIyyviz+Rshl5mz6On5/pweZM3CPfpxl1rJ1FksO8cHv/hLu79/DeyTcVkm4o5m1tIbLg/EUH2uv30vCIKii0EGPX4GvT46IU9BGST6Oto8c3NzWXBggX069ePli1bMn36dA4ePMijjz7KsWPHeOKJJ9Drz8/rKBQK7fCYmcDNmvjio9fRKy6M3YkZdIhqAkB0iB/bDqfyr2suITW3iFYRgViskrM5RRQWW/Hzqb9FxWK18crGv7iqXQQDL42q9c/f4RC6if1bse6P0ySm5fP3GTMTL/0v2zMe4NHv76BL81EE+f+bbw/kkWUyoxMCvRD8o1Mz1zD39Hwz+UVWAhyJ9ACjwS0EVHv2SinZtWsXH330EcuWLSM/P5+uXbvyyiuvMGLECDp37qzKORUKD8YjPIBAo8FVrdK3TTgHU3LYdSydJr4GxvSJ5Y9TWfyRZA8DtWkaSFyEPVTk3P1a26z9PZmZ6w6UOial5IV1B1j443HeLFOP//bmv1m+OwmAjX+m8NSqP8vdqPb25r/5cs+pCr/36/1naB7sR++4MKJD/Pj5aDpZpmJ6tWzJj3f/yONXPs6hnLX8lDeJkf2T2XkkjR8S0njl1svo1jLUzQMw23MARru+B/kaatUDyMzM5J133uGyyy7jyiuvZNmyZYwbN45du3axb98+nnrqKbp06aIWf4XCw/EID+CSyEDXnXzfNuFICV/vT6FnXBjXd4xkzpYEHlpmrzRtFRHgKnFMyjTRrpl9YInVJnlsxR+0iwrinqvb4GuonmeQnlfE/1
uzn9xCC6N7taRrTAgAK/acYsmuk7RrFsRfZ3I5ci6Pds2CyMw3897WBMA+RP3DH45RbJX0jA1lzOWxrs/9F4xSxQAADmtJREFU6Wga725JwNego3/bCGJCS/fZzyksZvvhVO7s1wqdTtAqIoBfjqcD0LlFCP4+/rx+w+uM6TKGe766h1l7ptI+ug+j2z7CmD7274kItItoen4RJrOV6BB7BVCgrx5TkdVRBVSt/ywAZGRk8PHHH/PKK6+QlZVF3759WbBgAePGjaNJkybV/2CFQqEJHuEBuNMzNgwfvaDYKukcHUy3lqHccUUcXWNCGNOnJR2imhAbbp9Teyglx/W+7w6dZdXeZN745jBD3/mBtLySuQJ/JGUx7X/xnMutfPfwnC0JmMxW/Hx0LP4p0XV88U+JdI0J5n/39EUIXG0ZfjiShpTQIsSfeduOEhceQPeWIbz57eFSoxhfWn+I5sH2Kqc3vzl/9OXmA2cxW20M626P/7eOCMQm7SWvnZqXLK69W/Rm9792M//m+eRbT/PanvEM/t9gjmYcdfXusXsAVvwdHkCgr4F888V7AJmZmWzatImZM2cyePBgoqKiePzxx+nXrx/x8fH88ssv/Otf/1KLv0LhpXicAPgb9XRzdKHs0iIYvU7w8qjLWDj5ct64rTs+eh3NmvjSt3U4c75LYO/JTAA+/TmRFiF+fDypDyczTK6GcacyTdyzeDebDpytsJUC2BuqfbnnFJ/9cpLb+8YxuldL1v5xmsx8M8fT8jlwOodbesQQHeLP5a3DWb8vBSkl2w+nEhrgw6r/XMmk/q1YNLkvz4/oQmpuEU+s3EdKdgEvbzjEwZQcnr75UqZc04Y1v5/md0dIy8n6faeJCfWnp6MHTyvHLN42EYGuUk4nPnofpveZzpH7j/DWDW8Rfzqe/h/3589zewn2M5CRbya/yEKgIwcQaDRwLNW+h6KiKiCr1cqff/7Jhx9+yN13303nzp0JDw9nyJAhzJw5k5SUFB566CHi4+P5+uuv6d2798X8b1UoFB6IR4SAytK3TTh7TmTSuUVwua8LIZh/Zy9umfcj//o0nnsHtOPHI+k8PqQjAy+N4s5+rfj050R6xYUxd9sRiiw2br4smhV7TrkW72HdovmnI3SyPzmb59bu57eTWfSMC2XGDR1IzSvis19O8vHO4/ga7Dp5czf73fnwbtE8u/YA8Scy2f53Kte0jyQq2I+ZI+2bmuIiAnhoUHve23rE1dJhTJ+WDO8WzT86NWN5/CleXH+QL6f3B+C7Q+fYeSSNu65q44qbt46wezmXVvDfAMDfx59HrnyEYR2GceOSG+n3cT/89e1YfbwHGeZW+BjGAnD7FXE8svwPAFc3UKvVSnx8PBs3buTHH3/k119/JTfXPgc4IiKC/v37c8cdd9CvXz8uv/xygoMrtkOhUHgnHikAt/eNQwCdmle86EQE+bJocl+mL9nDzHUHMRp0jHUs6A8ObM+q307xyIo/iAr25cOJfbg0Opifjqbx2Jf70OsEO4+kYTTo2HUsgy92nyQi0Mibt3VjdK+W6HSCsEAjt/RowdxtRwgLMNKnVZhrp/KI7jG8v/0Ydy/aTW6Rhes7nN8356FBHRjStTkr4k8xuHMUV1xi72wZ5Gvg0cEdeGLln3yw4xi7jqWz7XAq7ZoFMbF/yYhDpwfQObryhbdj0478fM/PzPllDh//+i37s9Zg0xWy/PhOnrVu5abLoukRG8qnP58guiiJG2+8l127dpGTk4NOp6N79+5MmDCBfv360b9/f9q2basSuApFI0DU9mjC6tCnTx8ZHx9frfcWW218/stJAox61x09wJZDZzl4Ooe7r27jCqFsO3yOP5KyGdc3lsmLdnMoJQe9TjCpf2seuqE9wX4+pT67sNjK2AW7+CMpixeGd2byVW1crx1KyeG2+T+Rb7by6zMDXU3XqoLVJhn23k4OpeTQxNfAg4PaM+nK1vi4ZWitNsnszX9zZ79WNA
+p+mdv2JfCo1/+xjnr12QY5zK45WBujb6VSGskfx/4m+eff57IyEiGDx/ONddcw5AhQxplv/29e68HoGfPbZraoVDUBCHEHilln2q/39sFoLqkZBfw0Q/HGXt5rGvPQXmk5haxZNcJ/nXtJQSVicXHJ2bw15lc7ux38cPJD57O4as/TnP31a0vSjycSCnJzs4mOTmZ06dPk5ycXPL8zDkOJp7mROgezNc6kuE24Bz0Te3L1wu+bpSLvjtKABQNASUADQyz2UxmZiYZGRmkp6dz+vTp8xd4x0+T6fzd0OHh4TRt2pQmTZrQq1cvuvbvSoZfBknWJDac3kDz4Ob8Nu03dMLj8v/1ihIARUOgpgLgkTmAhkBRUREZGRmuR3p6eqnfK3otLy+v3M/z8/OjRYsWxMTE0KdPH9dz58+YmBiio6Px9/cv9/0AS/YtYcLqCaw+tJrRnUfX1T9doVB4CUoALoDNZiM3N5fs7GzXXXlVH+XdnTsxGAyEh4cTHh5OREQEsbGxdO/e3XXMeTw8PJzo6GhatGhBWFhYjROz47uO56UdL/H8tufp17IfMcExNfo8hULh3TRYAZBSkp+fT2ZmJllZWa5HTk4O2dnZVXrk5uZyoRCZr69vqUW7TZs29O7du9SxsLAwIiIiXAt6eHg4QUFBmlTZ6HV6Xh34Krcuv5XY2bH8o80/mNBtAkPbD6VZYLN6t0ehUGhLnQmAEGIIMAfQAx9JKV+r7mfZbDays7M5c+YMCQkJnD592rWwuy/wZY9ZrdYLfq7RaCQkJKTUo3379q7nwcHBrufuC7jzcaFwi6cy6tJRJNyfwJJ9S/jfvv8xee1kANqHt+fK2Cvp2bwnsSGxxIXEERscS+T/b+9cQ+OowjD8vI1ZY9toUk1DE6NtJQpSUKuIgvrHa4taLyAV8YJCERS8IFItlP4TFRUEUSoWL3hDVOwf8YZYbNGqNbWtbe0lkTTGbKpgrRU1+vljzobZNLtVs5sz634PDHPm25nk5Z2z5+y5zJlpbXU/XuA4/1eqMggsqQH4BrgQ2AN8BlxrZl+Pd/68efNs+fLl9PX1MTQ0RD6fH93n83mGh4cZGRk56LpcLkdraystLS2H3Le0tBxU2Dc1/fvZN/8nzIz1A+tZ8+0a1vavZV3/OoYPDBedk2vI0dncSeeRnXQ0d9AxvYOO5o7R4+m56TROaaSxoZEGNdTM8wN7d19PbkojZ5z+cWwpjvOfyeQsIElnAyvM7OJwfB+AmT1Q4vxREVOnTqW9vZ2ZM2cWbW1tbbS1tdHd3U1XVxetra00NTXVTIFTC5gZew/spX9fP/0/9RftB/cP8t3P3zGwb4Bf/vjl0H8s4zx2SrK/f8sRNEzx9xQ4tcn++/dnchZQJ9CfOt4Dxe9wl7QEWBIOfwM2Q/Li8N7eXnp7e6skbUIcA+yNLeIf4DoPwV2jqX+0pLj7WTlqQSPUjs6TJnJxtEFgM1sJrASQ9PlEarHJwnVWFtdZWWpBZy1ohNrSOZHrqzW6NwB0pY6PDTHHcRwnI1SrAvgM6JY0R1IOWAysrtL/chzHcf4DVekCMrMRSbcD75BMA11lZlvKXLKyGjqqgOusLK6zstSCzlrQCHWiMxNrATmO4ziTjz/h4ziOU6d4BeA4jlOnRK8AJF0iabuknZKWxtZTQFKXpA8lfS1pi6Q7QnyFpAFJPWFbmAGtfZI2BT2fh9gMSe9J2hH2rRH1nZTyq0fSPkl3ZsFLSask5SVtTsXG9U4Jj4e8+pWk+ZF1PixpW9DypqSWEJ8t6deUr09F1lnyPku6L/i5XdLFkXW+mtLYJ6knxKP4WaYMqlz+NLNoG8kA8S5gLpADNgInx9SU0jYLmB/SzSRLW5wMrADuia1vjNY+4JgxsYeApSG9FHgwts7UPf8eOD4LXgLnAfOBzYfyDlgIvA0IOAv4NLLOi4DDQvrBlM7Z6fMy4Oe49zl8nzYChwNzQlnQEEvnmM8fAZbH9LNMGVSx/B
m7BXAmsNPMdpvZ78ArwKLImgAws0Ez2xDSPwNbSZ5wrhUWAc+F9HPAFRG1pDkf2GVm38YWAmBma4Afx4RLebcIeN4SPgFaJM2KpdPM3jWzwiJZn5A8bxOVEn6WYhHwipn9Zma9wE6SMqHqlNOpZH2Za4CXJ0NLKcqUQRXLn7ErgPGWjMhcIStpNnAa8GkI3R6aWKtidq2kMOBdSV8oWWIDoN3MBkP6e6A9jrSDWEzxFytrXkJp77KcX28m+fVXYI6kLyV9JOncWKJSjHefs+rnucCQme1IxaL6OaYMqlj+jF0BZB5J04HXgTvNbB/wJHACcCowSNJUjM05ZjYfWADcJum89IeWtA+jz/dV8lDg5cBrIZRFL4vIinflkLQMGAFeDKFB4DgzOw24G3hJ0pGx9FED93kM11L8IyWqn+OUQaNMNH/GrgAyvWSEpEYS4180szcAzGzIzP40s7+Ap5mkJms5zGwg7PPAmySahgrNv7DPx1M4ygJgg5kNQTa9DJTyLnP5VdJNwKXAdaEwIHSp/BDSX5D0rZ8YS2OZ+5xFPw8DrgJeLcRi+jleGUQF82fsCiCzS0aEfsBngK1m9mgqnu5Tu5KwimksJE2T1FxIkwwMbibx8cZw2o3AW3EUFlH0yyprXqYo5d1q4IYw2+Is4KdUU3zSUfLSpXuBy83sQCrepuSdHEiaC3QDu+OoLHufVwOLJR0uaQ6JzvWTrW8MFwDbzGxPIRDLz1JlEJXMn5M9sj3OSPdCktHtXcCy2HpSus4haVp9BfSEbSHwArApxFcDsyLrnEsyk2IjsKXgIXA08AGwA3gfmBFZ5zTgB+CoVCy6lyQV0iDwB0mf6S2lvCOZXfFEyKubgDMi69xJ0udbyJ9PhXOvDnmhB9gAXBZZZ8n7DCwLfm4HFsTUGeLPAreOOTeKn2XKoIrlT18KwnEcp06J3QXkOI7jRMIrAMdxnDrFKwDHcZw6xSsAx3GcOsUrAMdxnDrFKwDHcZw6xSsAx3GcOuVvgBrgqHS3VdoAAAAASUVORK5CYII=\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": [],
            "needs_background": "light"
          }
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "7g_uzIBAy2xE",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "333f0dd4-6e43-4d8e-d7af-7266568f46b2"
      },
      "source": [
        "# Mount Google Drive and persist the episode rewards and training-end marker\n",
        "from google.colab import drive\n",
        "drive.mount('/content/drive')\n",
        "\n",
        "np.save('/content/drive/My Drive/db/dqn/ddqn/rewards', rewards)\n",
        "np.save('/content/drive/My Drive/db/dqn/ddqn/TRAIN_END', TRAIN_END)"
      ],
      "execution_count": 14,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "w_Rrp12ml7zH",
        "colab_type": "text"
      },
      "source": [
        "**Changes**  \n",
        "These are all the same changes as in the DQN notebook, with the exception of the update-weights step.  \n",
        "*Hyperparameters*: You can alter alpha, gamma, the batch size, and the episode length to see how the results change.  \n",
        "*Training End*: You can also change the line where I check only the last 5 runs before switching to testing mode (`if len(rewards) > 5 and np.average(rewards[-5:]) > 195:`), since passing that check doesn't prove the environment was solved. I did it this way because I wanted to limit the number of runs.  \n",
        "*Update Weights*: I call `dqn.update_target_from_model()` after every episode. You can adjust when this runs: I have done it per step (no matter how long the episode ran), and I have seen it done every X episodes. Feel free to try different things."
      ]
    },
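    {
      "cell_type": "markdown",
      "metadata": {
        "id": "update_weights_sketch",
        "colab_type": "text"
      },
      "source": [
        "As a minimal sketch of those schedules (assuming the `dqn` agent with its `update_target_from_model()` method and the `EPISODES` constant from the training loop above; the interval `X` is a hypothetical hyperparameter), the target-network sync can be moved around like this:\n",
        "\n",
        "```python\n",
        "for e in range(EPISODES):\n",
        "    # ... run one episode, training the online network ...\n",
        "\n",
        "    # Per episode (what this notebook does):\n",
        "    dqn.update_target_from_model()\n",
        "\n",
        "    # Every X episodes instead:\n",
        "    # if e % X == 0:\n",
        "    #     dqn.update_target_from_model()\n",
        "```\n",
        "\n",
        "Syncing less often gives a more stable bootstrap target, at the cost of learning from staler estimates."
      ]
    },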
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "r1CRumb0l7zI",
        "colab_type": "text"
      },
      "source": [
        "**Conclusion**  \n",
        "This is a Double Deep Q-Network implementation. There are some changes you can make here and there, but it follows the paper as closely as I could. If you want to go deeper, the paper includes graphs that examine the inner workings of the neural network.  "
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KBYGuNO5l7zJ",
        "colab_type": "text"
      },
      "source": [
        "**Reference**  \n",
        "Van Hasselt, H., Guez, A., & Silver, D. (2016). Deep Reinforcement Learning with Double Q-Learning. In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16)."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "1Pu4DK6zl7zK",
        "colab_type": "text"
      },
      "source": [
        "## Project Conclusion  \n",
        "This completes the set of notebooks covering the original Q-Learner through its more recent extensions. Once you have worked in this area for a while, you can see how powerful that first update statement really was. With just some slight tweaks, these algorithms were able to achieve higher scores than even advanced Atari players.  \n",
        "\n",
        "I hope these notebooks have piqued your interest enough to continue your reinforcement learning journey."
      ]
    }
  ]
}