{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# COMP9417 19T2  Homework 2: Applying and Implementing Machine Learning"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "_Mon Jul 29 09:18:30 AEST 2019_"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The aim of this homework is to enable you to:\n",
    "\n",
    "- **apply** parameter search for machine learning algorithms implemented in the Python [scikit-learn](http://scikit-learn.org/stable/index.html) machine learning library\n",
    "- answer questions based on your **analysis** and **interpretation** of the empirical results of such applications, using your knowledge of machine learning\n",
    "- **complete** an implementation of a different version of a learning algorithm you have previously seen\n",
    "\n",
    "After completing this homework you will be able to:\n",
    "\n",
    "- set up a simple grid search over different hyper-parameter settings based on $k$-fold cross-validation to obtain  performance measures on different datasets\n",
    "- compare the performance measures of different algorithm settings \n",
    "- propose properties of algorithms and their hyper-parameters, or datasets, which\n",
    "  may lead to performance differences being observed\n",
    "- suggest reasons for actual observed performance differences in terms of\n",
    "  properties of algorithms, parameter settings or datasets.\n",
    "- read and understand incomplete code for a learning algorithm to the point of being able to complete the implementation and run it successfully on a dataset.\n",
    "\n",
    "There are a total of *10 marks* available.\n",
    "Each homework mark is worth *0.5 course mark*, i.e., homework marks will be scaled\n",
    "to a **course mark out of 5** to contribute to the course total.\n",
    "\n",
    "Deadline: 17:59:59, Monday August  5, 2019.\n",
    "\n",
    "Submission will be via the CSE *give* system (see below).\n",
    "\n",
    "Late penalties: one mark will be deducted from the total for each day late, up to a total of five days. If six or more days late, no marks will be given.\n",
    "\n",
    "Recall the guidance regarding plagiarism in the course introduction: this applies to this homework and if evidence of plagiarism is detected it may result in penalties ranging from loss of marks to suspension.\n",
    "\n",
    "### Format of the questions\n",
    "\n",
     "There are 2 questions in this homework. Question 1 includes some multiple-choice questions, and both questions require you to copy and paste text into the file [*answers.txt*](http://www.cse.unsw.edu.au/~cs9417/19T2/hw2/answers.txt). This file **MUST CONTAIN ONLY PLAIN TEXT WITH NO SPECIAL CHARACTERS**.\n",
    "\n",
    "This file will form your submission.\n",
    "\n",
    "In summary, your submission will comprise a single file which should be named as follows:\n",
    "```\n",
    "answers.txt\n",
    "```\n",
    "Please note: files in any format other than plain text **cannot be accepted**.\n",
    "\n",
    "Submit your files using ```give```. On a CSE Linux machine, type the following on the command-line:\n",
    "```\n",
    "$ give cs9417 hw2 answers.txt\n",
    "```\n",
    "\n",
    "Alternatively, you can submit using the web-based interface to ```give```.\n",
    "\n",
    "### Datasets\n",
    "\n",
    "You can download the datasets required for the homework [here](http://www.cse.unsw.edu.au/~cs9417/hw2/datasets.zip).\n",
    "Note: you will need to ensure the dataset files are in the same directory from which you are running this notebook.\n",
    "\n",
    "**Please Note**: this homework uses some datasets in the Attribute-Relation File Format (.arff). To load datasets from '.arff' formatted files, you will need to have installed the ```liac-arff``` package. You can do this using ```pip``` at the command-line, as follows:\n",
    "\n",
    "```\n",
    "$ pip install liac-arff\n",
    "```"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Question 1 – Overfitting avoidance [Total: 3 marks]\n",
    "\n",
    "Dealing with noisy data is a key issue in machine learning. Unfortunately, even algorithms that have noise-handling mechanisms built-in, like decision trees, can overfit noisy data, unless their \"overfitting avoidance\" or *regularization* hyper-parameters are set properly.\n",
    "\n",
    "You will be using datasets that have had various amounts of \"class noise\" added\n",
    "by randomly changing the actual class value to a different one for a\n",
    "specified percentage of the training data.\n",
    "Here we will specify three arbitrarily chosen levels of noise: low\n",
    "($20\\%$), medium ($50\\%$) and high ($80\\%$).\n",
    "The learning algorithm must try to \"see through\" this noise and learn\n",
    "the best model it can, which is then evaluated on test data *without*\n",
    "added noise to evaluate how well it has avoided fitting the noise.\n",
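     "\n",
     "As a toy illustration (not part of the assessed code), class noise of this kind can be added along the lines of the ```preprocess``` function further below: shift a randomly chosen fraction of the labels by a non-zero offset modulo the number of classes, so every affected label actually changes.\n",
     "\n",
     "```\n",
     "import numpy as np\n",
     "\n",
     "rng = np.random.RandomState(1234)\n",
     "labels = np.array([0, 1, 2, 0, 1, 2, 0, 1, 2, 0])  # toy 3-class labels\n",
     "noise_ratio = 0.5\n",
     "n_noisy = int(len(labels) * noise_ratio)\n",
     "# non-zero offsets guarantee each affected label is changed\n",
     "offsets = rng.randint(1, 3, size=n_noisy)\n",
     "noisy = labels.copy()\n",
     "noisy[:n_noisy] = (noisy[:n_noisy] + offsets) % 3\n",
     "```\n",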
    "\n",
    "We will also let the algorithm do a limited _grid search_ using cross-validation\n",
    "for the best *over-fitting avoidance* parameter settings on each training set.\n",
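     "\n",
     "As a minimal sketch of this idea (illustrative only, using a synthetic dataset rather than the homework data), a grid search with cross-validation can be set up as follows:\n",
     "\n",
     "```\n",
     "from sklearn.datasets import make_classification\n",
     "from sklearn.model_selection import GridSearchCV\n",
     "from sklearn.tree import DecisionTreeClassifier\n",
     "\n",
     "X, y = make_classification(n_samples=200, random_state=0)\n",
     "# try a few values of min_samples_leaf with 5-fold cross-validation\n",
     "grid = GridSearchCV(DecisionTreeClassifier(random_state=0),\n",
     "                    {'min_samples_leaf': [2, 7, 12]}, cv=5)\n",
     "grid.fit(X, y)\n",
     "best = grid.best_params_['min_samples_leaf']\n",
     "```\n",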
    "\n",
    "### Running the classifiers\n",
    "\n",
    "**1(a). [1 mark]** \n",
    "\n",
    "Run the code section in the notebook cells below. This will generate a table of results, which you should copy and paste **WITHOUT MODIFICATION** into the file [*answers.txt*](http://www.cse.unsw.edu.au/~cs9417/19T2/hw2/answers.txt)\n",
    "as your answer for \"Question 1(a)\". \n",
    "\n",
     "The output of the code section is a table of percentage classification accuracies for the decision tree algorithm. The first column, \"Default\", shows the decision tree run with default parameter settings on each dataset after $50\\%$ class noise has been added. The remaining columns show the decision tree run on each dataset with $0\\%$, $20\\%$, $50\\%$ and $80\\%$ noise added; the number in parentheses is the value selected by a [grid search](http://en.wikipedia.org/wiki/Hyperparameter_optimization) for a basic parameter of the decision tree algorithm, namely [min_samples_leaf](http://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html), i.e., the minimum number of examples that can be used to make a prediction in the tree, on that dataset. \n",
    "\n",
    "### Result interpretation\n",
    "Answer these questions in the file called [*answers.txt*](http://www.cse.unsw.edu.au/~cs9417/19T2/hw2/answers.txt). Your answers must be based on the results table you saved in \"Question 1(a)\".\n",
    "\n",
    "**1(b). [1 mark]** Refer to [*answers.txt*](http://www.cse.unsw.edu.au/~cs9417/19T2/hw2/answers.txt).\n",
    "\n",
    "**1(c). [1 mark]** Refer to [*answers.txt*](http://www.cse.unsw.edu.au/~cs9417/19T2/hw2/answers.txt)."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Code for question 1\n",
    "\n",
     "It is only necessary to run the following code to answer the question, but you should also read through it to make sure you understand what it is doing."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Code for question 1\n",
    "\n",
    "import arff, numpy as np\n",
    "import pandas as pd\n",
    "from sklearn.base import TransformerMixin\n",
    "from sklearn import tree\n",
    "from sklearn import preprocessing\n",
    "from sklearn.model_selection import train_test_split\n",
    "from sklearn import svm, datasets\n",
    "from sklearn.model_selection import GridSearchCV\n",
    "from sklearn.tree import DecisionTreeClassifier\n",
    "from sklearn.metrics import accuracy_score\n",
    "import sys\n",
    "import warnings"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "# fixed random seed\n",
    "np.random.seed(1)\n",
    "\n",
    "def warn(*args, **kwargs):\n",
    "    pass\n",
    "\n",
    "def label_enc(labels):\n",
    "    le = preprocessing.LabelEncoder()\n",
    "    le.fit(labels)\n",
    "    return le\n",
    "\n",
     "def features_encoders(features, categorical_features='all'):\n",
     "    n_samples, n_features = features.shape\n",
     "    label_encoders = [preprocessing.LabelEncoder() for _ in range(n_features)]\n",
     "\n",
     "    X_int = np.zeros_like(features, dtype=int)\n",
    "\n",
    "    for i in range(n_features):\n",
    "        feature_i = features[:, i]\n",
    "        label_encoders[i].fit(feature_i)\n",
    "        X_int[:, i] = label_encoders[i].transform(feature_i)\n",
    "        \n",
    "    enc = preprocessing.OneHotEncoder(categorical_features=categorical_features)\n",
    "    return enc.fit(X_int),label_encoders\n",
    "\n",
     "def feature_transform(features, label_encoders, one_hot_encoder):\n",
     "    n_samples, n_features = features.shape\n",
     "    X_int = np.zeros_like(features, dtype=int)\n",
    "    \n",
    "    for i in range(n_features):\n",
    "        feature_i = features[:, i]\n",
    "        X_int[:, i] = label_encoders[i].transform(feature_i)\n",
    "\n",
    "    return one_hot_encoder.transform(X_int).toarray()\n",
    "\n",
    "warnings.warn = warn"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [],
   "source": [
    "class DataFrameImputer(TransformerMixin):\n",
    "\n",
    "    def fit(self, X, y=None):\n",
    "\n",
    "        self.fill = pd.Series([X[c].value_counts().index[0]\n",
    "            if X[c].dtype == np.dtype('O') else X[c].mean() for c in X],\n",
    "            index=X.columns)\n",
    "\n",
    "        return self\n",
    "\n",
    "    def transform(self, X, y=None):\n",
    "        return X.fillna(self.fill)\n",
    "\n",
    "\n",
    "def load_data(path):\n",
    "    dataset = arff.load(open(path, 'r'))\n",
    "    data = np.array(dataset['data'])\n",
    "    data = pd.DataFrame(data)\n",
    "    data = DataFrameImputer().fit_transform(data).values\n",
    "    attr = dataset['attributes']\n",
    "\n",
    "    # mask categorical features\n",
    "    masks = []\n",
    "    for i in range(len(attr)-1):\n",
    "        if attr[i][1] != 'REAL':\n",
    "            masks.append(i)\n",
    "    return data, masks\n",
    "\n",
     "def preprocess(data, masks, noise_ratio):\n",
     "    # split data\n",
     "    train_data, test_data = train_test_split(data, test_size=0.3, random_state=0)\n",
    "\n",
    "    # test data\n",
    "    test_features = test_data[:,0:test_data.shape[1]-1]\n",
    "    test_labels = test_data[:,test_data.shape[1]-1]\n",
    "\n",
    "    # training data\n",
    "    features = train_data[:,0:train_data.shape[1]-1]\n",
    "    labels = train_data[:,train_data.shape[1]-1]\n",
    "\n",
    "    classes = list(set(labels))\n",
    "    # categorical features need to be encoded\n",
    "    if len(masks):\n",
    "        one_hot_enc, label_encs = features_encoders(data[:,0:data.shape[1]-1],masks)\n",
    "        test_features = feature_transform(test_features,label_encs,one_hot_enc)\n",
    "        features = feature_transform(features,label_encs,one_hot_enc)\n",
    "\n",
    "    le = label_enc(data[:,data.shape[1]-1])\n",
    "    labels = le.transform(train_data[:,train_data.shape[1]-1])\n",
    "    test_labels = le.transform(test_data[:,test_data.shape[1]-1])\n",
    "    \n",
    "    # add noise\n",
    "    np.random.seed(1234)\n",
    "    noise = np.random.randint(len(classes)-1, size=int(len(labels)*noise_ratio))+1\n",
    "    \n",
     "    noise = np.concatenate((noise, np.zeros(len(labels) - len(noise), dtype=int)))\n",
    "    labels = (labels + noise) % len(classes)\n",
    "\n",
    "    return features,labels,test_features,test_labels"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "                                             Decision Tree Results                                              \n",
      "----------------------------------------------------------------------------------------------------------------\n",
      "    Dataset     |     Default      |        0%        |       20%        |       50%        |       80%        |\n",
      "----------------------------------------------------------------------------------------------------------------\n",
      "\n",
      "datasets/balance-scale|  36.70% ( 2)     |  76.06% ( 2)     |  71.28% (12)     |  65.43% (27)     |  18.09% (27)     |\n",
      "\n",
      "datasets/primary-tumor|  25.49% ( 2)     |  37.25% (12)     |  42.16% (12)     |  43.14% (12)     |  26.47% ( 7)     |\n",
      "\n",
      "datasets/glass  |  44.62% ( 2)     |  69.23% ( 7)     |  66.15% (22)     |  35.38% (17)     |  29.23% (17)     |\n",
      "\n",
      "datasets/heart-h|  35.96% ( 2)     |  67.42% ( 7)     |  78.65% (22)     |  56.18% (17)     |  20.22% (27)     |\n",
      "\n",
      "\n",
      "\n"
     ]
    }
   ],
   "source": [
    "# load data\n",
    "paths = ['balance-scale','primary-tumor',\n",
    "         'glass','heart-h']\n",
    "noise = [0,0.2,0.5,0.8]\n",
    "\n",
    "scores = []\n",
    "params = []\n",
    "\n",
    "for path in paths:\n",
    "    score = []\n",
    "    param = []\n",
    "    path += '.arff'\n",
    "    data, masks = load_data(path)\n",
    "    \n",
    "    # training on data with 50% noise and default parameters\n",
    "    features, labels, test_features, test_labels = preprocess(data, masks, 0.5)\n",
     "    clf = DecisionTreeClassifier(random_state=0, min_samples_leaf=2, min_impurity_decrease=0)\n",
     "    clf.fit(features, labels)\n",
     "    tree_preds = clf.predict(test_features)\n",
     "    tree_performance = accuracy_score(test_labels, tree_preds)\n",
     "    score.append(tree_performance)\n",
     "    param.append(clf.get_params()['min_samples_leaf'])\n",
    "    \n",
    "    # training on data with noise levels of 0%, 20%, 50% and 80%\n",
    "    for noise_ratio in noise:\n",
    "        features, labels, test_features, test_labels = preprocess(data, masks, noise_ratio)\n",
    "        param_grid = {'min_samples_leaf': np.arange(2,30,5)}\n",
    "\n",
    "        grid_tree = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid,cv=10,return_train_score=True)\n",
    "        grid_tree.fit(features, labels)\n",
    "\n",
    "        estimator = grid_tree.best_estimator_\n",
    "        tree_preds = grid_tree.predict(test_features)\n",
    "        tree_performance = accuracy_score(test_labels, tree_preds)\n",
    "        score.append(tree_performance)\n",
    "        param.append(estimator.get_params()['min_samples_leaf'])\n",
    "\n",
    "    scores.append(score)\n",
    "    params.append(param)\n",
    "\n",
    "# print the results\n",
    "header = \"{:^112}\".format(\"Decision Tree Results\") + '\\n' + '-' * 112  + '\\n' + \\\n",
    "\"{:^15} | {:^16} | {:^16} | {:^16} | {:^16} | {:^16} |\".format(\"Dataset\", \"Default\", \"0%\", \"20%\", \"50%\", \"80%\") + \\\n",
    " '\\n' + '-' * 112  + '\\n'\n",
    "\n",
    "# print result table\n",
    "print(header)\n",
    "for i in range(len(scores)):\n",
    "    print(\"{:<16}\".format(paths[i]),end=\"\")\n",
    "    for j in range(len(params[i])):\n",
    "        print(\"|  {:>6.2%} ({:>2})     \" .format(scores[i][j],params[i][j]),end=\"\")\n",
    "    print('|\\n')\n",
    "print('\\n')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Question 2 – Implementation of a simple RNN [Total: 7 marks]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In this question, you will implement a simple recurrent neural network (RNN).\n",
    "\n",
    "Recurrent neural networks are commonly used when the input data has temporal dependencies among consecutive observations, for example time series and text data. With such data, having knowledge of the previous data points in addition to the current helps in prediction."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Recurrent neural networks\n",
    "\n",
    "RNNs are suitable in such scenarios because they keep a state derived from all previously seen data, which in combination with the current input is used to predict the output.\n",
    "\n",
     "In general, recurrent neural networks work as follows:\n",
    "\n",
    "![Screen%20Shot%202019-07-23%20at%2010.51.21%20am.png](files/screenshots/rnn.png)\n",
    "_(Image credit: Goodfellow, Bengio & Courville (2015) - Deep Learning)_\n",
    "\n",
    "Here, $x$ is the input, and $h$ is the hidden state maintained by the RNN. For each input in the sequence, the RNN takes both the previous state $h_{t-1}$ and the current input $x_t$ to do the prediction.\n",
    "\n",
     "Notice there is only one set of weights in the RNN, but this set of weights is used across the whole input sequence. In effect, the RNN is chained with itself a number of times equal to the length of the input.\n",
    "\n",
     "Thus, to train the RNN, a common practice is to unfold the computational graph and run standard back-propagation on it. This technique is known as back-propagation through time (BPTT)."
   ]
  },
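  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal sketch of this recurrence (using NumPy with made-up sizes; this is not the assessed implementation), each step feeds the concatenation of the previous state and the current input through one shared weight matrix, $h_t = \\sigma(W[h_{t-1}; x_t] + b)$:\n",
    "\n",
    "```\n",
    "import numpy\n",
    "\n",
    "def sigmoid(x):\n",
    "    return 1 / (1 + numpy.exp(-x))\n",
    "\n",
    "hidden, inputs, steps = 4, 3, 5  # made-up sizes, for illustration only\n",
    "rng = numpy.random.RandomState(0)\n",
    "W = rng.randn(hidden + inputs, hidden)  # one weight matrix shared across steps\n",
    "b = numpy.zeros((1, hidden))\n",
    "x = rng.randn(steps, inputs)\n",
    "\n",
    "h = numpy.zeros((1, hidden))  # initial state\n",
    "for t in range(steps):  # the same W is reused at every step\n",
    "    h = sigmoid(numpy.dot(numpy.concatenate([h, x[t:t+1]], axis=1), W) + b)\n",
    "```"
   ]
  },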
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Your task\n",
    "\n",
    "Given a dataset of partial words (words without the last character), your task is to implement an RNN to predict the last character in the word. Specifically, your RNN will have the first 9 characters of a word as its input, and you need to predict the 10th character. If there are fewer than 10 characters in a word, spaces are used to pad it.\n",
    "\n",
     "Most of the code needed is provided below; what you need to do is implement the back-propagation through time section in ```NeuralNetwork.fit()```.\n",
    "\n",
    "There are four sections marked ```TO DO: ``` where you need to add your own code to complete a working implementation.\n",
    "\n",
    "**HINT:** review the implementation of the ```backpropagate``` method of the ```NeuralNetwork``` class in the code in the notebook for Lab6 on \"Neural Learning\". That should give you a starting point for your implementation.\n",
    "\n",
    "Ensure that you have the following files you need for training and testing in the directory in which you run this notebook:\n",
    "```\n",
    "training_input.txt\n",
    "training_label.txt\n",
    "testing_input.txt\n",
    "testing_label.txt\n",
    "```\n",
    "\n",
    "**HINT:** if your implementation is correct your output should look something like the following:\n",
    "\n",
    "![Screen%20Shot%202019-07-28%20at%205.24.11%20pm.png](files/screenshots/testing.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Submission\n",
    "\n",
     "Submit a text file ```RNN_solutions.txt```, containing only the four sections of code you completed (together with their comments).\n",
    "\n",
    "Sample submission:\n",
    "\n",
    "```\n",
    "# setup for the current step\n",
    "layer_input = []\n",
    "weight = []\n",
    "\n",
    "# calculate gradients\n",
    "gradients, dW, db = [], [], []\n",
    "\n",
    "# update weights\n",
    "self.weights[0] += 0\n",
    "self.biases[0] += 0\n",
    "\n",
    "# setup for the next step\n",
    "previous_gradients = []\n",
    "layer_output = []\n",
    "```"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Marking\n",
    "\n",
    "If your implementation runs and obtains a testing accuracy of more than 0.5 then your submission will be given full marks.\n",
    "\n",
    "Otherwise, each submitted correct section of your code will receive some part of the total marks, as follows:\n",
    "\n",
    "```\n",
    "# setup for the current step [2 marks]\n",
    "layer_input = []\n",
    "weight = []\n",
    "\n",
    "# calculate gradients [1 mark]\n",
    "gradients, dW, db = [], [], []\n",
    "\n",
    "# update weights [2 marks]\n",
    "self.weights[0] += 0\n",
    "self.biases[0] += 0\n",
    "\n",
    "# setup for the next step [2 marks]\n",
    "previous_gradients = []\n",
    "layer_output = []\n",
    "```\n",
    "\n",
    "**NOTE:** it is OK to split your code for each section over multiple lines."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "import time\n",
    "import numpy"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [],
   "source": [
    "# helper functions to read data\n",
    "def read_data(file_name, encoding_function, expected_length):\n",
    "    with open(file_name, 'r') as f:\n",
    "        return numpy.array([encoding_function(row) for row in f.read().split('\\n') if len(row) == expected_length])\n",
    "\n",
    "def encode_string(s):\n",
    "    return [one_hot_encode_character(c) for c in s]\n",
    "\n",
    "def one_hot_encode_character(c):\n",
    "    base = [0] * 26\n",
    "    index = ord(c) - ord('a')\n",
    "    if index >= 0 and index <= 25:\n",
    "        base[index] = 1\n",
    "    return base\n",
    "\n",
    "def reverse_one_hot_encode(v):\n",
    "    return chr(numpy.argmax(v) + ord('a')) if max(v) > 0 else ' '"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "# functions used in the neural network\n",
    "def sigmoid(x):\n",
    "    return 1 / (1 + numpy.exp(-x))\n",
    "\n",
     "def sigmoid_derivative(x):\n",
     "    # note: x here is the sigmoid *output* s, so this computes s * (1 - s)\n",
     "    return (1 - x) * x\n",
    "\n",
    "def argmax(x):\n",
    "    return numpy.argmax(x, axis=1)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {},
   "outputs": [],
   "source": [
    "class NeuralNetwork:\n",
    "    def __init__(self, learning_rate=2, epochs=5000, input_size=9, hidden_layer_size=64):\n",
    "        # activation function and its derivative to be used in backpropagation\n",
    "        self.activation_function = sigmoid\n",
    "        self.derivative_of_activation_function = sigmoid_derivative\n",
    "        self.map_output_to_prediction = argmax\n",
    "\n",
    "        # parameters\n",
    "        self.learning_rate = learning_rate\n",
    "        self.epochs = epochs\n",
    "        self.input_size = input_size\n",
    "        self.hidden_layer_size = hidden_layer_size\n",
    "\n",
    "        # initialisation\n",
    "        numpy.random.seed(77)\n",
    "\n",
    "    def fit(self, X, y):\n",
    "        # reset timer\n",
    "        timer_base = time.time()\n",
    "        \n",
    "        # initialise the weights of the NN\n",
    "        input_dim = X.shape[2] + self.hidden_layer_size\n",
    "        output_dim = y.shape[1]\n",
    "\n",
    "        self.weights, self.biases = [], []\n",
    "        previous_layer_size = input_dim\n",
    "        \n",
    "        for current_layer_size in [self.hidden_layer_size, output_dim]:\n",
    "            # random initial weights and zero biases\n",
    "            weights_of_current_layer = numpy.random.randn(previous_layer_size, current_layer_size)\n",
    "            bias_of_current_layer = numpy.zeros((1, current_layer_size))\n",
    "            self.weights.append(weights_of_current_layer)\n",
    "            self.biases.append(bias_of_current_layer)\n",
    "            previous_layer_size = current_layer_size\n",
    "\n",
    "        # train the NN\n",
    "        self.accuracy_log = []\n",
    "\n",
    "        for epoch in range(self.epochs + 1):\n",
    "            outputs = self.forward_propagate(X)\n",
     "            prediction = outputs.pop()  # (2225, 26)\n",
    "            if epoch % 100 == 0:\n",
    "                accuracy = self.evaluate(prediction, y)\n",
    "                print(f\"In iteration {epoch}, training accuracy is {accuracy}.\")\n",
    "                self.accuracy_log.append(accuracy)\n",
    "\n",
     "            # first step of back-propagation\n",
     "            dEdz = y - prediction  # (2225, 26)\n",
     "            layer_input = outputs.pop()  # (2225, 64)\n",
     "            layer_output = prediction\n",
     "\n",
     "            # calculate gradients\n",
     "            dEds, dW, db = self.derivatives_of_last_layer(dEdz, layer_output, layer_input)\n",
     "\n",
     "            # update the weights and biases of the output layer\n",
     "            self.weights[1] += self.learning_rate / X.shape[0] * dW\n",
     "            self.biases[1] += self.learning_rate / X.shape[0] * db\n",
     "\n",
     "            # setup for the next step\n",
     "            previous_gradients = dEds  # (2225, 26)\n",
     "            layer_output = layer_input  # (2225, 64)\n",
    "            \n",
    "            # back-propagation through time (unrolled)\n",
    "            for step in range(self.input_size - 1, -1, -1):\n",
     "                if step != 0:\n",
     "                    hiddenLayer = outputs.pop()  # (2225, 64)\n",
     "                else:\n",
     "                    hiddenLayer = numpy.zeros((X.shape[0], self.hidden_layer_size))  # (2225, 64)\n",
     "                layer_input = numpy.concatenate((hiddenLayer, X[:, step, :]), axis=1)  # (2225, 90)\n",
     "                if step == self.input_size - 1:\n",
     "                    weight = self.weights[1]  # (64, 26)\n",
     "                else:\n",
     "                    weight = self.weights[0]  # (90, 64)\n",
     "                # keep only the rows corresponding to the hidden state\n",
     "                weight = weight[:self.hidden_layer_size, :]\n",
     "\n",
     "                gradients, dW, db = self.derivatives_of_hidden_layer(previous_gradients, layer_output, layer_input, weight)\n",
     "                self.weights[0] += self.learning_rate / X.shape[0] * dW  # (90, 64)\n",
     "                self.biases[0] += self.learning_rate / X.shape[0] * db\n",
     "                previous_gradients = gradients  # (2225, 64)\n",
     "                layer_output = hiddenLayer\n",
    "        print(f\"Finished training in {time.time() - timer_base} seconds\")\n",
    "\n",
    "    def test(self, X, y, verbose=True):\n",
    "        predictions = self.forward_propagate(X)[-1]\n",
    "\n",
    "        if verbose:\n",
    "            for index in range(len(predictions)):\n",
    "                prefix = ''.join(reverse_one_hot_encode(v) for v in X[index])\n",
    "                print(f\"Expected {prefix + reverse_one_hot_encode(y[index])}, predicted {prefix + reverse_one_hot_encode(predictions[index])}\")\n",
    "\n",
    "        print(f\"Testing accuracy: {self.evaluate(predictions, y)}\")\n",
    "\n",
    "    def evaluate(self, predictions, target_values):\n",
    "        successful_predictions = numpy.where(self.map_output_to_prediction(predictions) == self.map_output_to_prediction(target_values))\n",
    "        return successful_predictions[0].shape[0] / len(predictions) if successful_predictions else 0\n",
    "\n",
    "\n",
    "    def forward_propagate(self, X):\n",
    "        # initial states\n",
     "        current_state = numpy.zeros((X.shape[0], self.hidden_layer_size))  # (2225, 64)\n",
     "        outputs = [current_state]\n",
     "\n",
     "        # forward propagation through time (unrolled)\n",
     "        for step in range(self.input_size):  # steps 0..8\n",
     "            x = numpy.concatenate((current_state, X[:, step, :]), axis=1)  # (2225, 64 + 26)\n",
     "\n",
     "            current_state = self.apply_neuron(self.weights[0], self.biases[0], x)  # (2225, 90) x (90, 64)\n",
     "            outputs.append(current_state)\n",
     "\n",
     "        # the last layer\n",
     "        output = self.apply_neuron(self.weights[1], self.biases[1], current_state)\n",
     "        outputs.append(output)\n",
     "        return outputs\n",
    "\n",
    "    def apply_neuron(self, w, b, x):\n",
    "        return self.activation_function(numpy.dot(x, w) + b)\n",
    "\n",
    "    def derivatives_of_last_layer(self, dEdz, layer_output, layer_input):\n",
    "        dEds = self.derivative_of_activation_function(layer_output) * dEdz\n",
    "        dW = numpy.dot(layer_input.T, dEds)\n",
    "        db = numpy.sum(dEds, axis=0, keepdims=True)\n",
    "        return dEds, dW, db\n",
    "\n",
    "    def derivatives_of_hidden_layer(self, layer_difference, layer_output, layer_input, weight):\n",
    "        gradients = self.derivative_of_activation_function(layer_output) * numpy.dot(layer_difference, weight.T)\n",
    "        dW = numpy.dot(layer_input.T, gradients)\n",
    "        db = numpy.sum(gradients, axis=0, keepdims=True)\n",
    "        return gradients, dW, db\n"
   ]
  },
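  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For reference, the helper methods at the end of the class implement the standard back-propagation equations for a sigmoid layer. Writing $a$ for the layer input, $o$ for the layer output and $\\delta_{\\text{in}}$ for the error signal arriving at the layer, ```derivatives_of_last_layer``` (with $\\delta_{\\text{in}} = y - \\hat{y}$) and ```derivatives_of_hidden_layer``` both compute:\n",
    "\n",
    "$$\\delta = o(1-o)\\odot\\delta_{\\text{in}},\\qquad \\Delta W = a^{\\top}\\delta,\\qquad \\Delta b = \\textstyle\\sum_{\\text{rows}}\\delta$$\n",
    "\n",
    "where, in the hidden-layer case, the incoming signal is first propagated back through the weights: $\\delta_{\\text{in}} = \\delta_{\\text{next}}\\,W^{\\top}$."
   ]
  },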
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "In iteration 0, training accuracy is 0.005393258426966292.\n",
      "In iteration 100, training accuracy is 0.0.\n",
      "In iteration 200, training accuracy is 0.0.\n",
      "In iteration 300, training accuracy is 0.0.\n",
      "In iteration 400, training accuracy is 0.0.\n",
      "In iteration 500, training accuracy is 0.3096629213483146.\n",
      "In iteration 600, training accuracy is 0.3649438202247191.\n",
      "In iteration 700, training accuracy is 0.37258426966292135.\n",
      "In iteration 800, training accuracy is 0.39280898876404496.\n",
      "In iteration 900, training accuracy is 0.4049438202247191.\n",
      "In iteration 1000, training accuracy is 0.42067415730337077.\n",
      "In iteration 1100, training accuracy is 0.4341573033707865.\n",
      "In iteration 1200, training accuracy is 0.4310112359550562.\n",
      "In iteration 1300, training accuracy is 0.4543820224719101.\n",
      "In iteration 1400, training accuracy is 0.46202247191011236.\n",
      "In iteration 1500, training accuracy is 0.46831460674157305.\n",
      "In iteration 1600, training accuracy is 0.4898876404494382.\n",
      "In iteration 1700, training accuracy is 0.5092134831460674.\n",
      "In iteration 1800, training accuracy is 0.5123595505617977.\n",
      "In iteration 1900, training accuracy is 0.529438202247191.\n",
      "In iteration 2000, training accuracy is 0.5523595505617978.\n",
      "In iteration 2100, training accuracy is 0.5608988764044944.\n",
      "In iteration 2200, training accuracy is 0.5730337078651685.\n",
      "In iteration 2300, training accuracy is 0.5582022471910112.\n",
      "In iteration 2400, training accuracy is 0.5635955056179776.\n",
      "In iteration 2500, training accuracy is 0.5797752808988764.\n",
      "In iteration 2600, training accuracy is 0.5829213483146067.\n",
      "In iteration 2700, training accuracy is 0.5950561797752809.\n",
      "In iteration 2800, training accuracy is 0.6035955056179775.\n",
      "In iteration 2900, training accuracy is 0.6049438202247192.\n",
      "In iteration 3000, training accuracy is 0.6107865168539326.\n",
      "In iteration 3100, training accuracy is 0.6206741573033708.\n",
      "In iteration 3200, training accuracy is 0.6229213483146068.\n",
      "In iteration 3300, training accuracy is 0.6256179775280899.\n",
      "In iteration 3400, training accuracy is 0.6408988764044944.\n",
      "In iteration 3500, training accuracy is 0.6426966292134831.\n",
      "In iteration 3600, training accuracy is 0.6507865168539326.\n",
      "In iteration 3700, training accuracy is 0.6566292134831461.\n",
      "In iteration 3800, training accuracy is 0.6696629213483146.\n",
      "In iteration 3900, training accuracy is 0.6651685393258427.\n",
      "In iteration 4000, training accuracy is 0.6669662921348315.\n",
      "In iteration 4100, training accuracy is 0.6755056179775281.\n",
      "In iteration 4200, training accuracy is 0.682247191011236.\n",
      "In iteration 4300, training accuracy is 0.6979775280898877.\n",
      "In iteration 4400, training accuracy is 0.7051685393258427.\n",
      "In iteration 4500, training accuracy is 0.706067415730337.\n",
      "In iteration 4600, training accuracy is 0.708314606741573.\n",
      "In iteration 4700, training accuracy is 0.7123595505617978.\n",
      "In iteration 4800, training accuracy is 0.7137078651685393.\n",
      "In iteration 4900, training accuracy is 0.7186516853932584.\n",
      "In iteration 5000, training accuracy is 0.7155056179775281.\n",
      "Finished training in 152.8129439353943 seconds\n"
     ]
    }
   ],
   "source": [
    "training_input = read_data(\"training_input.txt\", encode_string, 9)\n",
    "training_label = read_data(\"training_label.txt\", one_hot_encode_character, 1)\n",
    "\n",
    "model = NeuralNetwork()\n",
    "model.fit(training_input, training_label)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Expected         so, predicted         so\n",
      "Expected         is, predicted         is\n",
      "Expected         it, predicted         is\n",
      "Expected        not, predicted        not\n",
      "Expected       with, predicted       with\n",
      "Expected         me, predicted         my\n",
      "Expected         as, predicted         as\n",
      "Expected       with, predicted       with\n",
      "Expected       that, predicted       that\n",
      "Expected       muse, predicted       must\n",
      "Expected     stirrd, predicted     stirrs\n",
      "Expected         by, predicted         be\n",
      "Expected          a, predicted          o\n",
      "Expected    painted, predicted    painted\n",
      "Expected     beauty, predicted     beauty\n",
      "Expected         to, predicted         to\n",
      "Expected        his, predicted        his\n",
      "Expected      verse, predicted      verse\n",
      "Expected        who, predicted        who\n",
      "Expected     heaven, predicted     heaves\n",
      "Expected     itself, predicted     itsels\n",
      "Expected        for, predicted        for\n",
      "Expected   ornament, predicted   ornament\n",
      "Expected       doth, predicted       doth\n",
      "Expected        use, predicted        use\n",
      "Expected        and, predicted        and\n",
      "Expected      every, predicted      every\n",
      "Expected       fair, predicted       fair\n",
      "Expected       with, predicted       with\n",
      "Expected        his, predicted        his\n",
      "Expected       fair, predicted       fair\n",
      "Expected       doth, predicted       doth\n",
      "Expected   rehearse, predicted   rehearst\n",
      "Expected     making, predicted     making\n",
      "Expected          a, predicted          o\n",
      "Expected couplement, predicted couplement\n",
      "Expected         of, predicted         or\n",
      "Expected      proud, predicted      prous\n",
      "Expected    compare, predicted    compare\n",
      "Expected       with, predicted       with\n",
      "Expected        sun, predicted        suh\n",
      "Expected        and, predicted        and\n",
      "Expected       moon, predicted       moor\n",
      "Expected       with, predicted       with\n",
      "Expected      earth, predicted      earts\n",
      "Expected        and, predicted        and\n",
      "Expected       seas, predicted       seas\n",
      "Expected       rich, predicted       rich\n",
      "Expected       gems, predicted       geme\n",
      "Expected       with, predicted       with\n",
      "Expected     aprils, predicted     aprilf\n",
      "Expected  firstborn, predicted  firstbors\n",
      "Expected    flowers, predicted    flowers\n",
      "Expected        and, predicted        and\n",
      "Expected        all, predicted        all\n",
      "Expected     things, predicted     things\n",
      "Expected       rare, predicted       rard\n",
      "Expected       that, predicted       that\n",
      "Expected    heavens, predicted    heaveng\n",
      "Expected        air, predicted        air\n",
      "Expected         in, predicted         is\n",
      "Expected       this, predicted       this\n",
      "Expected       huge, predicted       huge\n",
      "Expected    rondure, predicted    rondurs\n",
      "Expected       hems, predicted       heme\n",
      "Expected          o, predicted          o\n",
      "Expected        let, predicted        let\n",
      "Expected         me, predicted         my\n",
      "Expected       true, predicted       trut\n",
      "Expected         in, predicted         is\n",
      "Expected       love, predicted       love\n",
      "Expected        but, predicted        but\n",
      "Expected      truly, predicted      truld\n",
      "Expected      write, predicted      writh\n",
      "Expected        and, predicted        and\n",
      "Expected       then, predicted       thee\n",
      "Expected    believe, predicted    believe\n",
      "Expected         me, predicted         my\n",
      "Expected         my, predicted         my\n",
      "Expected       love, predicted       love\n",
      "Expected         is, predicted         is\n",
      "Expected         as, predicted         as\n",
      "Expected       fair, predicted       fair\n",
      "Expected         as, predicted         as\n",
      "Expected        any, predicted        and\n",
      "Expected    mothers, predicted    mothers\n",
      "Expected      child, predicted      child\n",
      "Expected     though, predicted     thougy\n",
      "Expected        not, predicted        not\n",
      "Expected         so, predicted         so\n",
      "Expected     bright, predicted     bright\n",
      "Expected         as, predicted         as\n",
      "Expected      those, predicted      those\n",
      "Expected       gold, predicted       gold\n",
      "Expected    candles, predicted    candles\n",
      "Expected       fixd, predicted       fixd\n",
      "Expected         in, predicted         is\n",
      "Expected    heavens, predicted    heaveng\n",
      "Expected        air, predicted        air\n",
      "Expected        let, predicted        let\n",
      "Expected       them, predicted       thee\n",
      "Expected        say, predicted        say\n",
      "Expected       more, predicted       more\n",
      "Expected       than, predicted       that\n",
      "Expected       like, predicted       like\n",
      "Expected         of, predicted         or\n",
      "Expected    hearsay, predicted    hearsas\n",
      "Expected       well, predicted       well\n",
      "Expected          i, predicted          o\n",
      "Expected       will, predicted       will\n",
      "Expected        not, predicted        not\n",
      "Expected     praise, predicted     praise\n",
      "Expected       that, predicted       that\n",
      "Expected    purpose, predicted    purpose\n",
      "Expected        not, predicted        not\n",
      "Expected         to, predicted         to\n",
      "Expected       sell, predicted       self\n",
      "Expected         my, predicted         my\n",
      "Expected      glass, predicted      glass\n",
      "Expected      shall, predicted      shall\n",
      "Expected        not, predicted        not\n",
      "Expected   persuade, predicted   persuade\n",
      "Expected         me, predicted         my\n",
      "Expected          i, predicted          o\n",
      "Expected         am, predicted         as\n",
      "Expected        old, predicted        old\n",
      "Expected         so, predicted         so\n",
      "Expected       long, predicted       long\n",
      "Expected         as, predicted         as\n",
      "Expected      youth, predicted      youth\n",
      "Expected        and, predicted        and\n",
      "Expected       thou, predicted       thou\n",
      "Expected        are, predicted        art\n",
      "Expected         of, predicted         or\n",
      "Expected        one, predicted        one\n",
      "Expected       date, predicted       datl\n",
      "Expected        but, predicted        but\n",
      "Expected       when, predicted       wher\n",
      "Expected         in, predicted         is\n",
      "Expected       thee, predicted       thee\n",
      "Expected      times, predicted      times\n",
      "Expected    furrows, predicted    furrows\n",
      "Expected          i, predicted          o\n",
      "Expected     behold, predicted     behold\n",
      "Expected       then, predicted       thee\n",
      "Expected       look, predicted       lood\n",
      "Expected          i, predicted          o\n",
      "Expected      death, predicted      deats\n",
      "Expected         my, predicted         my\n",
      "Expected       days, predicted       dayr\n",
      "Expected     should, predicted     should\n",
      "Expected    expiate, predicted    expiats\n",
      "Expected        for, predicted        for\n",
      "Expected        all, predicted        all\n",
      "Expected       that, predicted       that\n",
      "Expected     beauty, predicted     beauty\n",
      "Expected       that, predicted       that\n",
      "Expected       doth, predicted       doth\n",
      "Expected      cover, predicted      coved\n",
      "Expected       thee, predicted       thee\n",
      "Expected         is, predicted         is\n",
      "Expected        but, predicted        but\n",
      "Expected        the, predicted        the\n",
      "Expected     seemly, predicted     seemly\n",
      "Expected    raiment, predicted    raiment\n",
      "Expected         of, predicted         or\n",
      "Expected         my, predicted         my\n",
      "Expected      heart, predicted      heard\n",
      "Expected      which, predicted      which\n",
      "Expected         in, predicted         is\n",
      "Expected        thy, predicted        the\n",
      "Expected     breast, predicted     breast\n",
      "Expected       doth, predicted       doth\n",
      "Expected       live, predicted       live\n",
      "Expected         as, predicted         as\n",
      "Expected      thine, predicted      thine\n",
      "Expected         in, predicted         is\n",
      "Expected         me, predicted         my\n",
      "Expected        how, predicted        hom\n",
      "Expected        can, predicted        cas\n",
      "Expected          i, predicted          o\n",
      "Expected       then, predicted       thee\n",
      "Expected         be, predicted         be\n",
      "Expected      elder, predicted      elded\n",
      "Expected       than, predicted       that\n",
      "Expected       thou, predicted       thou\n",
      "Expected        art, predicted        art\n",
      "Expected          o, predicted          o\n",
      "Expected  therefore, predicted  therefore\n",
      "Expected       love, predicted       love\n",
      "Expected         be, predicted         be\n",
      "Expected         of, predicted         or\n",
      "Expected    thyself, predicted    thyself\n",
      "Expected         so, predicted         so\n",
      "Expected       wary, predicted       wart\n",
      "Expected         as, predicted         as\n",
      "Expected          i, predicted          o\n",
      "Expected        not, predicted        not\n",
      "Expected        for, predicted        for\n",
      "Expected     myself, predicted     myself\n",
      "Expected        but, predicted        but\n",
      "Expected        for, predicted        for\n",
      "Expected       thee, predicted       thee\n",
      "Expected       will, predicted       will\n",
      "Expected    bearing, predicted    bearing\n",
      "Expected        thy, predicted        the\n",
      "Expected      heart, predicted      heard\n",
      "Expected      which, predicted      which\n",
      "Expected          i, predicted          o\n",
      "Expected       will, predicted       will\n",
      "Expected       keep, predicted       keet\n",
      "Expected         so, predicted         so\n",
      "Expected      chary, predicted      chard\n",
      "Expected         as, predicted         as\n",
      "Expected     tender, predicted     tender\n",
      "Expected      nurse, predicted      nurst\n",
      "Expected        her, predicted        her\n",
      "Expected       babe, predicted       babe\n",
      "Expected       from, predicted       from\n",
      "Expected     faring, predicted     faring\n",
      "Expected        ill, predicted        ill\n",
      "Expected    presume, predicted    presume\n",
      "Expected        not, predicted        not\n",
      "Expected         on, predicted         or\n",
      "Expected        thy, predicted        the\n",
      "Expected      heart, predicted      heard\n",
      "Expected       when, predicted       wher\n",
      "Expected       mine, predicted       ming\n",
      "Expected         is, predicted         is\n",
      "Expected      slain, predicted      slais\n",
      "Expected       thou, predicted       thou\n",
      "Expected     gavest, predicted     gavest\n",
      "Expected         me, predicted         my\n",
      "Expected      thine, predicted      thine\n",
      "Expected        not, predicted        not\n",
      "Expected         to, predicted         to\n",
      "Expected       give, predicted       give\n",
      "Expected       back, predicted       bace\n",
      "Expected      again, predicted      agair\n",
      "Expected         as, predicted         as\n",
      "Expected         an, predicted         as\n",
      "Expected  unperfect, predicted  unperfect\n",
      "Expected      actor, predicted      actor\n",
      "Expected         on, predicted         or\n",
      "Expected        the, predicted        the\n",
      "Expected      stage, predicted      stage\n",
      "Expected        who, predicted        who\n",
      "Expected       with, predicted       with\n",
      "Expected        his, predicted        his\n",
      "Expected       fear, predicted       fear\n",
      "Expected         is, predicted         is\n",
      "Expected        put, predicted        puh\n",
      "Expected    besides, predicted    besided\n",
      "Expected        his, predicted        his\n",
      "Expected       part, predicted       part\n",
      "Expected         or, predicted         or\n",
      "Expected       some, predicted       some\n",
      "Expected     fierce, predicted     fiercd\n",
      "Expected      thing, predicted      thine\n",
      "Expected    replete, predicted    replets\n",
      "Expected       with, predicted       with\n",
      "Expected        too, predicted        tor\n",
      "Expected       much, predicted       much\n",
      "Expected       rage, predicted       rage\n",
      "Expected      whose, predicted      whose\n",
      "Expected  strengths, predicted  strengths\n",
      "Expected  abundance, predicted  abundance\n",
      "Expected    weakens, predicted    weakent\n",
      "Expected        his, predicted        his\n",
      "Expected        own, predicted        ows\n",
      "Expected      heart, predicted      heard\n",
      "Expected         so, predicted         so\n",
      "Expected          i, predicted          o\n",
      "Expected        for, predicted        for\n",
      "Expected       fear, predicted       fear\n",
      "Expected         of, predicted         or\n",
      "Expected      trust, predicted      truse\n",
      "Expected     forget, predicted     forget\n",
      "Expected         to, predicted         to\n",
      "Expected        say, predicted        say\n",
      "Expected        the, predicted        the\n",
      "Expected    perfect, predicted    perfect\n",
      "Expected   ceremony, predicted   ceremons\n",
      "Expected         of, predicted         or\n",
      "Expected      loves, predicted      loves\n",
      "Expected       rite, predicted       rith\n",
      "Expected        and, predicted        and\n",
      "Expected         in, predicted         is\n",
      "Expected       mine, predicted       ming\n",
      "Expected        own, predicted        ows\n",
      "Expected      loves, predicted      loves\n",
      "Expected   strength, predicted   strengts\n",
      "Expected       seem, predicted       seer\n",
      "Expected         to, predicted         to\n",
      "Expected      decay, predicted      decay\n",
      "Expected oercharged, predicted oercharged\n",
      "Expected       with, predicted       with\n",
      "Expected     burden, predicted     burder\n",
      "Expected         of, predicted         or\n",
      "Expected       mine, predicted       ming\n",
      "Expected        own, predicted        ows\n",
      "Expected      loves, predicted      loves\n",
      "Expected      might, predicted      might\n",
      "Expected          o, predicted          o\n",
      "Expected        let, predicted        let\n",
      "Expected         my, predicted         my\n",
      "Expected      books, predicted      bookd\n",
      "Expected         be, predicted         be\n",
      "Expected       then, predicted       thee\n",
      "Expected        the, predicted        the\n",
      "Expected  eloquence, predicted  eloquence\n",
      "Expected        and, predicted        and\n",
      "Expected       dumb, predicted       dumh\n",
      "Expected  presagers, predicted  presagers\n",
      "Expected         of, predicted         or\n",
      "Expected         my, predicted         my\n",
      "Expected   speaking, predicted   speaking\n",
      "Expected     breast, predicted     breast\n",
      "Expected        who, predicted        who\n",
      "Expected      plead, predicted      pleas\n",
      "Expected        for, predicted        for\n",
      "Expected       love, predicted       love\n",
      "Expected        and, predicted        and\n",
      "Expected       look, predicted       lood\n",
      "Expected        for, predicted        for\n",
      "Expected recompense, predicted recompenst\n",
      "Expected       more, predicted       more\n",
      "Expected       than, predicted       that\n",
      "Expected       that, predicted       that\n",
      "Expected     tongue, predicted     tongur\n",
      "Expected       that, predicted       that\n",
      "Expected       more, predicted       more\n",
      "Expected       hath, predicted       hate\n",
      "Expected       more, predicted       more\n",
      "Expected   expressd, predicted   expresse\n",
      "Expected          o, predicted          o\n",
      "Expected      learn, predicted      leard\n",
      "Expected         to, predicted         to\n",
      "Expected       read, predicted       rear\n",
      "Expected       what, predicted       what\n",
      "Expected     silent, predicted     silens\n",
      "Expected       love, predicted       love\n",
      "Expected       hath, predicted       hate\n",
      "Expected       writ, predicted       writ\n",
      "Expected         to, predicted         to\n",
      "Expected       hear, predicted       hear\n",
      "Expected       with, predicted       with\n",
      "Expected       eyes, predicted       eyes\n",
      "Expected    belongs, predicted    belonge\n",
      "Expected         to, predicted         to\n",
      "Expected      loves, predicted      loves\n",
      "Expected       fine, predicted       find\n",
      "Expected        wit, predicted        wit\n",
      "Expected       mine, predicted       ming\n",
      "Expected        eye, predicted        eye\n",
      "Expected       hath, predicted       hate\n",
      "Expected      playd, predicted      plays\n",
      "Expected        the, predicted        the\n",
      "Expected    painter, predicted    painted\n",
      "Expected        and, predicted        and\n",
      "Expected       hath, predicted       hate\n",
      "Expected     stelld, predicted     stells\n",
      "Expected        thy, predicted        the\n",
      "Expected    beautys, predicted    beautys\n",
      "Expected       form, predicted       fore\n",
      "Expected         in, predicted         is\n",
      "Expected      table, predicted      tabls\n",
      "Expected         of, predicted         or\n",
      "Expected         my, predicted         my\n",
      "Expected      heart, predicted      heard\n",
      "Expected         my, predicted         my\n",
      "Expected       body, predicted       bods\n",
      "Expected         is, predicted         is\n",
      "Expected        the, predicted        the\n",
      "Expected      frame, predicted      frame\n",
      "Expected    wherein, predicted    whereit\n",
      "Expected        tis, predicted        tit\n",
      "Expected       held, predicted       held\n",
      "Expected        and, predicted        and\n",
      "Expected erspective, predicted erspective\n",
      "Expected         it, predicted         is\n",
      "Expected         is, predicted         is\n",
      "Expected        the, predicted        the\n",
      "Expected   painters, predicted   painters\n",
      "Expected        art, predicted        art\n",
      "Expected        for, predicted        for\n",
      "Expected    through, predicted    througs\n",
      "Expected        the, predicted        the\n",
      "Expected    painter, predicted    painted\n",
      "Expected       must, predicted       must\n",
      "Expected        you, predicted        you\n",
      "Expected        see, predicted        see\n",
      "Expected        his, predicted        his\n",
      "Expected      skill, predicted      skill\n",
      "Expected         to, predicted         to\n",
      "Expected       find, predicted       find\n",
      "Expected      where, predicted      where\n",
      "Expected       your, predicted       your\n",
      "Expected       true, predicted       trut\n",
      "Expected      image, predicted      image\n",
      "Expected   pictured, predicted   pictures\n",
      "Expected       lies, predicted       lier\n",
      "Expected      which, predicted      which\n",
      "Expected         in, predicted         is\n",
      "Expected         my, predicted         my\n",
      "Expected     bosoms, predicted     bosomy\n",
      "Expected       shop, predicted       shos\n",
      "Expected         is, predicted         is\n",
      "Expected    hanging, predicted    hanging\n",
      "Expected      still, predicted      still\n",
      "Expected       that, predicted       that\n",
      "Expected       hath, predicted       hate\n",
      "Expected        his, predicted        his\n",
      "Expected    windows, predicted    windows\n",
      "Expected     glazed, predicted     glazes\n",
      "Expected       with, predicted       with\n",
      "Expected      thine, predicted      thine\n",
      "Expected       eyes, predicted       eyes\n",
      "Expected        now, predicted        not\n",
      "Expected        see, predicted        see\n",
      "Expected       what, predicted       what\n",
      "Expected       good, predicted       good\n",
      "Expected      turns, predicted      turns\n",
      "Expected       eyes, predicted       eyes\n",
      "Expected        for, predicted        for\n",
      "Expected       eyes, predicted       eyes\n",
      "Expected       have, predicted       have\n",
      "Expected       done, predicted       dong\n",
      "Expected       mine, predicted       ming\n",
      "Expected       eyes, predicted       eyes\n",
      "Expected       have, predicted       have\n",
      "Expected      drawn, predicted      drawd\n",
      "Expected        thy, predicted        the\n",
      "Expected      shape, predicted      shapy\n",
      "Expected        and, predicted        and\n",
      "Expected      thine, predicted      thine\n",
      "Expected        for, predicted        for\n",
      "Expected         me, predicted         my\n",
      "Expected        are, predicted        art\n",
      "Expected    windows, predicted    windows\n",
      "Expected         to, predicted         to\n",
      "Expected         my, predicted         my\n",
      "Expected     breast, predicted     breast\n",
      "Expected erethrough, predicted erethrougs\n",
      "Expected        the, predicted        the\n",
      "Expected        sun, predicted        suh\n",
      "Expected   delights, predicted   delights\n",
      "Expected         to, predicted         to\n",
      "Expected       peep, predicted       peer\n",
      "Expected         to, predicted         to\n",
      "Expected       gaze, predicted       gaze\n",
      "Expected    therein, predicted    thereit\n",
      "Expected         on, predicted         or\n",
      "Expected       thee, predicted       thee\n",
      "Expected        yet, predicted        yet\n",
      "Expected       eyes, predicted       eyes\n",
      "Expected       this, predicted       this\n",
      "Expected    cunning, predicted    cunning\n",
      "Expected       want, predicted       wand\n",
      "Expected         to, predicted         to\n",
      "Expected      grace, predicted      grace\n",
      "Expected      their, predicted      their\n",
      "Expected        art, predicted        art\n",
      "Expected       they, predicted       thee\n",
      "Expected       draw, predicted       dras\n",
      "Expected        but, predicted        but\n",
      "Expected       what, predicted       what\n",
      "Expected       they, predicted       thee\n",
      "Expected        see, predicted        see\n",
      "Expected       know, predicted       knot\n",
      "Expected        not, predicted        not\n",
      "Expected        the, predicted        the\n",
      "Expected      heart, predicted      heard\n",
      "Expected        let, predicted        let\n",
      "Expected      those, predicted      those\n",
      "Expected        who, predicted        who\n",
      "Expected        are, predicted        art\n",
      "Expected         in, predicted         is\n",
      "Expected     favour, predicted     favouh\n",
      "Expected       with, predicted       with\n",
      "Expected      their, predicted      their\n",
      "Expected      stars, predicted      start\n",
      "Expected         of, predicted         or\n",
      "Expected     public, predicted     publis\n",
      "Expected     honour, predicted     honour\n",
      "Expected        and, predicted        and\n",
      "Expected      proud, predicted      prous\n",
      "Expected     titles, predicted     titles\n",
      "Expected      boast, predicted      boass\n",
      "Expected     whilst, predicted     whilst\n",
      "Expected          i, predicted          o\n",
      "Expected       whom, predicted       whom\n",
      "Expected    fortune, predicted    fortune\n",
      "Expected         of, predicted         or\n",
      "Expected       such, predicted       such\n",
      "Expected    triumph, predicted    triumps\n",
      "Expected       bars, predicted       bard\n",
      "Expected    unlookd, predicted    unlookd\n",
      "Expected        for, predicted        for\n",
      "Expected        joy, predicted        jor\n",
      "Expected         in, predicted         is\n",
      "Expected       that, predicted       that\n",
      "Expected          i, predicted          o\n",
      "Expected     honour, predicted     honour\n",
      "Expected       most, predicted       most\n",
      "Expected      great, predicted      greaf\n",
      "Expected    princes, predicted    princes\n",
      "Expected favourites, predicted favourites\n",
      "Expected      their, predicted      their\n",
      "Expected       fair, predicted       fair\n",
      "Expected     leaves, predicted     leaves\n",
      "Expected     spread, predicted     spreaf\n",
      "Expected        but, predicted        but\n",
      "Expected         as, predicted         as\n",
      "Expected        the, predicted        the\n",
      "Expected   marigold, predicted   marigols\n",
      "Expected         at, predicted         as\n",
      "Expected        the, predicted        the\n",
      "Expected       suns, predicted       sung\n",
      "Expected        eye, predicted        eye\n",
      "Expected        and, predicted        and\n",
      "Expected         in, predicted         is\n",
      "Expected themselves, predicted themselves\n",
      "Expected      their, predicted      their\n",
      "Expected      pride, predicted      pride\n",
      "Expected       lies, predicted       lier\n",
      "Expected     buried, predicted     buried\n",
      "Expected        for, predicted        for\n",
      "Expected         at, predicted         as\n",
      "Expected          a, predicted          o\n",
      "Expected      frown, predicted      frowh\n",
      "Expected       they, predicted       thee\n",
      "Expected         in, predicted         is\n",
      "Expected      their, predicted      their\n",
      "Expected      glory, predicted      glors\n",
      "Expected        die, predicted        dit\n",
      "Expected        the, predicted        the\n",
      "Expected    painful, predicted    painfum\n",
      "Expected    warrior, predicted    warriog\n",
      "Expected   famoused, predicted   famouses\n",
      "Expected        for, predicted        for\n",
      "Expected      fight, predicted      fight\n",
      "Expected      after, predicted      afted\n",
      "Expected          a, predicted          o\n",
      "Expected   thousand, predicted   thousand\n",
      "Expected  victories, predicted  victorier\n",
      "Expected       once, predicted       once\n",
      "Expected      foild, predicted      foild\n",
      "Expected         is, predicted         is\n",
      "Expected       from, predicted       from\n",
      "Expected        the, predicted        the\n",
      "Expected       book, predicted       boor\n",
      "Expected         of, predicted         or\n",
      "Expected     honour, predicted     honour\n",
      "Expected      razed, predicted      razes\n",
      "Expected      quite, predicted      quity\n",
      "Expected        and, predicted        and\n",
      "Expected        all, predicted        all\n",
      "Expected        the, predicted        the\n",
      "Expected       rest, predicted       rest\n",
      "Expected     forgot, predicted     forgot\n",
      "Expected        for, predicted        for\n",
      "Expected      which, predicted      which\n",
      "Expected         he, predicted         he\n",
      "Expected      toild, predicted      toild\n",
      "Expected       then, predicted       thee\n",
      "Expected      happy, predicted      happy\n",
      "Expected          i, predicted          o\n",
      "Expected       that, predicted       that\n",
      "Expected       love, predicted       love\n",
      "Expected        and, predicted        and\n",
      "Expected         am, predicted         as\n",
      "Expected    beloved, predicted    beloved\n",
      "Expected      where, predicted      where\n",
      "Expected          i, predicted          o\n",
      "Expected        may, predicted        may\n",
      "Expected        not, predicted        not\n",
      "Expected     remove, predicted     remove\n",
      "Expected        nor, predicted        not\n",
      "Expected         be, predicted         be\n",
      "Expected    removed, predicted    removes\n",
      "Testing accuracy: 0.6230636833046471\n"
     ]
    }
   ],
   "source": [
    "testing_input = read_data(\"testing_input.txt\", encode_string, 9)\n",
    "testing_label = read_data(\"testing_label.txt\", one_hot_encode_character, 1)\n",
    "\n",
    "model.test(testing_input, testing_label, verbose=True)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# a (3, 1) and b (3, 3) share a row count, so they can only be joined\n",
    "# column-wise; np.concatenate also expects the arrays as a sequence\n",
    "a = np.array([[1], [2], [3]])\n",
    "b = np.array([[1, 2, 3], [1, 2, 3], [1, 2, 3]])\n",
    "c = np.concatenate((a, b), axis=1)  # shape (3, 4)\n",
    "d = np.array([[1, 2, 3, 2], [3, 2, 4, 1]])"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.1"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
