{ "cells": [ { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "Igc5itf-xMGj" }, "source": [ "# Masakhane - Machine Translation for African Languages (Using JoeyNMT)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "x4fXCKCf36IK" }, "source": [ "## Note before beginning:\n", "### - The idea is that you should be able to make minimal changes to this in order to get SOME result for your own translation corpus. \n", "\n", "### - The tl;dr: Go to the **\"TODO\"** comments which will tell you what to update to get up and running\n", "\n", "### - If you actually want to have a clue what you're doing, read the text and peek at the links\n", "\n", "### - With 100 epochs, it should take around 7 hours to run in Google Colab\n", "\n", "### - Once you've gotten a result for your language, please attach and email your notebook that generated it to masakhanetranslation@gmail.com\n", "\n", "### - If you care enough and get a chance, doing a brief background on your language would be amazing. See examples in [(Martinus, 2019)](https://arxiv.org/abs/1906.05685)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "l929HimrxS0a" }, "source": [ "## Retrieve your data & make a parallel corpus\n", "\n", "If you are wanting to use the JW300 data referenced on the Masakhane website or in our GitHub repo, you can use `opus-tools` to convert the data into a convenient format. `opus_read` from that package provides a convenient tool for reading the native aligned XML files and to convert them to TMX format. The tool can also be used to fetch relevant files from OPUS on the fly and to filter the data as necessary. [Read the documentation](https://pypi.org/project/opustools-pkg/) for more details.\n", "\n", "Once you have your corpus files in TMX format (an xml structure which will include the sentences in your target language and your source language in a single file), we recommend reading them into a pandas dataframe. Thankfully, Jade wrote a silly `tmx2dataframe` package which converts your tmx file to a pandas dataframe. " ] }, { "cell_type": "code", "execution_count": 0, "metadata": { "colab": {}, "colab_type": "code", "id": "oGRmDELn7Az0" }, "outputs": [], "source": [ "\"\"\"from google.colab import drive\n", "drive.mount('/content/drive')\"\"\"" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "colab": {}, "colab_type": "code", "id": "Cn3tgQLzUxwn" }, "outputs": [], "source": [ "# TODO: Set your source and target languages. Keep in mind, these traditionally use language codes as found here:\n", "# These will also become the suffix's of all vocab and corpus files used throughout\n", "import os\n", "source_language = \"en\"\n", "target_language = \"pcm\" \n", "lc = False # If True, lowercase the data.\n", "seed = 42 # Random seed for shuffling.\n", "tag = \"jw300-baseline\" # Give a unique name to your folder - this is to ensure you don't rewrite any models you've already submitted\n", "\n", "os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n", "os.environ[\"tgt\"] = target_language\n", "os.environ[\"tag\"] = tag\n", "\n", "\n", "# This will save it to a folder in our gdrive instead!\n", "!mkdir -p \"en_pcm/$src-$tgt-$tag\"\n", "os.environ[\"experiment_path\"] = \"en_pcm/%s-%s-%s\" % (source_language, target_language, tag)" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "!mkdir -p \"en_pcm/$src-$tgt-$tag\"" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "colab": {}, "colab_type": "code", "id": "kBSgJHEw7Nvx" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "en_pcm/en-pcm-jw300-baseline\r\n" ] } ], "source": [ "!echo $experiment_path" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "colab": {}, "colab_type": "code", "id": "gA75Fs9ys8Y9" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Requirement already satisfied: opustools-pkg in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (0.0.52)\n", "\u001b[33mYou are using pip version 10.0.1, however version 20.0.2 is available.\n", "You should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\n" ] } ], "source": [ "# Install opus-tools\n", "! pip install opustools-pkg" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "colab": {}, "colab_type": "code", "id": "xq-tDZVks7ZD" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "Alignment file /proj/nlpl/data/OPUS/JW300/latest/xml/en-pcm.xml.gz not found. The following files are available for downloading:\n", "\n", " 256 KB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en-pcm.xml.gz\n", " 263 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en.zip\n", " 3 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/pcm.zip\n", "\n", " 266 MB Total size\n", "./JW300_latest_xml_en-pcm.xml.gz ... 100% of 256 KB\n", "./JW300_latest_xml_en.zip ... 100% of 263 MB\n", "./JW300_latest_xml_pcm.zip ... 100% of 3 MB\n", "gzip: unrecognized option '--y'\n", "Try `gzip --help' for more information.\n" ] } ], "source": [ "# Downloading our corpus\n", "! opus_read -d JW300 -s $src -t $tgt -wm moses -w jw300.$src jw300.$tgt -q \n", "\n", "# extract the corpus file\n", "! gunzip JW300_latest_xml_$src-$tgt.xml.gz --y" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "colab": {}, "colab_type": "code", "id": "n48GDRnP8y2G" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "--2020-02-09 12:37:41-- https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-any.en\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 199.232.24.133\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|199.232.24.133|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 277791 (271K) [text/plain]\n", "Saving to: ‘test.en-any.en’\n", "\n", "test.en-any.en 100%[===================>] 271.28K --.-KB/s in 0.008s \n", "\n", "2020-02-09 12:37:41 (35.2 MB/s) - ‘test.en-any.en’ saved [277791/277791]\n", "\n", "--2020-02-09 12:37:41-- https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-pcm.en\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 199.232.24.133\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|199.232.24.133|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 154369 (151K) [text/plain]\n", "Saving to: ‘test.en-pcm.en’\n", "\n", "test.en-pcm.en 100%[===================>] 150.75K --.-KB/s in 0.005s \n", "\n", "2020-02-09 12:37:42 (29.3 MB/s) - ‘test.en-pcm.en’ saved [154369/154369]\n", "\n", "--2020-02-09 12:37:42-- https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-pcm.pcm\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 199.232.24.133\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|199.232.24.133|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 163323 (159K) [text/plain]\n", "Saving to: ‘test.en-pcm.pcm’\n", "\n", "test.en-pcm.pcm 100%[===================>] 159.50K --.-KB/s in 0.005s \n", "\n", "2020-02-09 12:37:42 (33.7 MB/s) - ‘test.en-pcm.pcm’ saved [163323/163323]\n", "\n" ] } ], "source": [ "# Download the global test set.\n", "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-any.en\n", " \n", "# And the specific test set for this language pair.\n", "os.environ[\"trg\"] = target_language \n", "os.environ[\"src\"] = source_language \n", "\n", "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-$trg.en \n", "! mv test.en-$trg.en test.en\n", "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-$trg.$trg \n", "! mv test.en-$trg.$trg test.$trg" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "colab": {}, "colab_type": "code", "id": "NqDG-CI28y2L" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Loaded 3571 global test sentences to filter from the training/dev data.\n" ] } ], "source": [ "# Read the test data to filter from train and dev splits.\n", "# Store english portion in set for quick filtering checks.\n", "en_test_sents = set()\n", "filter_test_sents = \"test.en-any.en\"\n", "j = 0\n", "with open(filter_test_sents) as f:\n", " for line in f:\n", " en_test_sents.add(line.strip())\n", " j += 1\n", "print('Loaded {} global test sentences to filter from the training/dev data.'.format(j))" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "colab": {}, "colab_type": "code", "id": "3CNdwLBCfSIl" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Loaded data and skipped 2536/26020 lines since contained in test set.\n" ] }, { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
source_sentencetarget_sentence
03 Settle Differences in a Spirit of Love3 Make Una Dey Use Love Settle Quarrel
1Because of our inherited imperfection , we are...Because of the sin wey all of us carry from be...
2This article shows how Bible principles can be...This topic go show us how we fit let the thing...
\n", "
" ], "text/plain": [ " source_sentence \\\n", "0 3 Settle Differences in a Spirit of Love \n", "1 Because of our inherited imperfection , we are... \n", "2 This article shows how Bible principles can be... \n", "\n", " target_sentence \n", "0 3 Make Una Dey Use Love Settle Quarrel \n", "1 Because of the sin wey all of us carry from be... \n", "2 This topic go show us how we fit let the thing... " ] }, "execution_count": 8, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import pandas as pd\n", "\n", "# TMX file to dataframe\n", "source_file = 'jw300.' + source_language\n", "target_file = 'jw300.' + target_language\n", "\n", "source = []\n", "target = []\n", "skip_lines = [] # Collect the line numbers of the source portion to skip the same lines for the target portion.\n", "with open(source_file) as f:\n", " for i, line in enumerate(f):\n", " # Skip sentences that are contained in the test set.\n", " if line.strip() not in en_test_sents:\n", " source.append(line.strip())\n", " else:\n", " skip_lines.append(i) \n", "with open(target_file) as f:\n", " for j, line in enumerate(f):\n", " # Only add to corpus if corresponding source was not skipped.\n", " if j not in skip_lines:\n", " target.append(line.strip())\n", " \n", "print('Loaded data and skipped {}/{} lines since contained in test set.'.format(len(skip_lines), i))\n", " \n", "df = pd.DataFrame(zip(source, target), columns=['source_sentence', 'target_sentence'])\n", "# if you get TypeError: data argument can't be an iterator is because of your zip version run this below\n", "#df = pd.DataFrame(list(zip(source, target)), columns=['source_sentence', 'target_sentence'])\n", "df.head(3)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "YkuK3B4p2AkN" }, "source": [ "## Pre-processing and export\n", "\n", "It is generally a good idea to remove duplicate translations and conflicting translations from the corpus. In practice, these public corpora include some number of these that need to be cleaned.\n", "\n", "In addition we will split our data into dev/test/train and export to the filesystem." ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "colab": {}, "colab_type": "code", "id": "M_2ouEOH1_1q" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages/ipykernel/__main__.py:7: SettingWithCopyWarning: \n", "A value is trying to be set on a copy of a slice from a DataFrame\n", "\n", "See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy\n", "/home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages/ipykernel/__main__.py:8: SettingWithCopyWarning: \n", "A value is trying to be set on a copy of a slice from a DataFrame\n", "\n", "See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy\n" ] } ], "source": [ "# drop duplicate translations\n", "df_pp = df.drop_duplicates()\n", "\n", "# drop conflicting translations\n", "# (this is optional and something that you might want to comment out \n", "# depending on the size of your corpus)\n", "df_pp.drop_duplicates(subset='source_sentence', inplace=True)\n", "df_pp.drop_duplicates(subset='target_sentence', inplace=True)\n", "\n", "# Shuffle the data to remove bias in dev set selection.\n", "df_pp = df_pp.sample(frac=1, random_state=seed).reset_index(drop=True)" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "colab": {}, "colab_type": "code", "id": "Z_1BwAApEtMk" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Collecting fuzzywuzzy\n", " Downloading https://files.pythonhosted.org/packages/d8/f1/5a267addb30ab7eaa1beab2b9323073815da4551076554ecc890a3595ec9/fuzzywuzzy-0.17.0-py2.py3-none-any.whl\n", "Installing collected packages: fuzzywuzzy\n", "Successfully installed fuzzywuzzy-0.17.0\n", "\u001b[33mYou are using pip version 10.0.1, however version 20.0.2 is available.\n", "You should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\n", "Collecting python-Levenshtein\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/42/a9/d1785c85ebf9b7dfacd08938dd028209c34a0ea3b1bcdb895208bd40a67d/python-Levenshtein-0.12.0.tar.gz (48kB)\n", "\u001b[K 100% |████████████████████████████████| 51kB 14.8MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: setuptools in /home/ec2-user/anaconda3/envs/python3/lib/python3.6/site-packages (from python-Levenshtein) (39.1.0)\n", "Building wheels for collected packages: python-Levenshtein\n", " Running setup.py bdist_wheel for python-Levenshtein ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /home/ec2-user/.cache/pip/wheels/de/c2/93/660fd5f7559049268ad2dc6d81c4e39e9e36518766eaf7e342\n", "Successfully built python-Levenshtein\n", "Installing collected packages: python-Levenshtein\n", "Successfully installed python-Levenshtein-0.12.0\n", "\u001b[33mYou are using pip version 10.0.1, however version 20.0.2 is available.\n", "You should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\n", "00:00:00.05 0.00 percent complete\n", "00:00:26.30 4.70 percent complete\n", "00:00:52.73 9.40 percent complete\n", "00:01:19.04 14.10 percent complete\n", "00:01:45.75 18.79 percent complete\n", "00:02:11.71 23.49 percent complete\n", "00:02:38.52 28.19 percent complete\n", "00:03:04.38 32.89 percent complete\n", "00:03:31.22 37.59 percent complete\n", "00:03:56.93 42.29 percent complete\n", "00:04:23.57 46.99 percent complete\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '']\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "00:04:50.25 51.68 percent complete\n", "00:05:15.71 56.38 percent complete\n", "00:05:41.33 61.08 percent complete\n", "00:06:07.34 65.78 percent complete\n", "00:06:33.58 70.48 percent complete\n", "00:06:59.31 75.18 percent complete\n", "00:07:26.38 79.88 percent complete\n", "00:07:53.15 84.57 percent complete\n", "00:08:19.62 89.27 percent complete\n", "00:08:45.67 93.97 percent complete\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '*']\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "00:09:11.83 98.67 percent complete\n" ] } ], "source": [ "# Install fuzzy wuzzy to remove \"almost duplicate\" sentences in the\n", "# test and training sets.\n", "! pip install fuzzywuzzy\n", "! pip install python-Levenshtein\n", "import time\n", "from fuzzywuzzy import process\n", "import numpy as np\n", "\n", "# reset the index of the training set after previous filtering\n", "df_pp.reset_index(drop=False, inplace=True)\n", "\n", "# Remove samples from the training data set if they \"almost overlap\" with the\n", "# samples in the test set.\n", "\n", "# Filtering function. Adjust pad to narrow down the candidate matches to\n", "# within a certain length of characters of the given sample.\n", "def fuzzfilter(sample, candidates, pad):\n", " candidates = [x for x in candidates if len(x) <= len(sample)+pad and len(x) >= len(sample)-pad] \n", " if len(candidates) > 0:\n", " return process.extractOne(sample, candidates)[1]\n", " else:\n", " return np.nan\n", "\n", "# NOTE - This might run slow depending on the size of your training set. We are\n", "# printing some information to help you track how long it would take. \n", "scores = []\n", "start_time = time.time()\n", "for idx, row in df_pp.iterrows():\n", " scores.append(fuzzfilter(row['source_sentence'], list(en_test_sents), 5))\n", " if idx % 1000 == 0:\n", " hours, rem = divmod(time.time() - start_time, 3600)\n", " minutes, seconds = divmod(rem, 60)\n", " print(\"{:0>2}:{:0>2}:{:05.2f}\".format(int(hours),int(minutes),seconds), \"%0.2f percent complete\" % (100.0*float(idx)/float(len(df_pp))))\n", "\n", "# Filter out \"almost overlapping samples\"\n", "df_pp['scores'] = scores\n", "df_pp = df_pp[df_pp['scores'] < 95]" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "colab": {}, "colab_type": "code", "id": "hxxBOCA-xXhy" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "==> train.en <==\n", "JEHOVAH’S servants highly esteem God’s own holy book , the Bible .\n", "At noon , they slowly walk home .\n", "To put the woman at ease , Jesus kindly said : “ Take courage , daughter ! ”\n", "( Compare Ezekiel 28 : 17 . )\n", "God’s active force is a very powerful source of comfort .\n", "What can we do to make wise use of our freedom ?\n", "Are you young or up in years ?\n", "“ People in our area are becoming more and more radical , ” notes one traveling overseer .\n", "This set Adam apart from the animals , since they live according to instinct .\n", "This , of course , was no accident .\n", "\n", "==> train.pcm <==\n", "JEHOVAH people value Bible well well because dem know sey na God holy book .\n", "Around twelve o’clock , they go waka sofri - sofri go house .\n", "Jesus come tell am wetin go make am no fear again . E tell am sey : ‘ No fear , my pikin ! ’\n", "( Still check Ezekiel 28 : 17 . )\n", "This one na one way wey e take dey help us .\n", "Wetin go help us use our freedom well ?\n", "You dey young or you don old ?\n", "One circuit overseer talk say more people for e area want make things change .\n", "This one make Adam different from animals , because animal life na one way .\n", "But na from small pikin Daniel start to do wetin good for God eye .\n", "==> dev.en <==\n", "Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "This faithful brother suffered repeated hardships .\n", "You will soon see how Jehovah “ cares for you . ”\n", "We all enjoy attending our Christian meetings as well as assemblies and conventions .\n", "Have complete confidence in Jehovah’s promise to watch over you , to give you “ the power beyond what is normal , ” and to help you withstand any attempts to frighten you into submission . ​ — 2 Cor .\n", "Parents , help your children to let their light shine by teaching them to comment in their own words .\n", "Felisa : Some time later , I got married and went to live in Cantabria , a province in Spain .\n", "\n", "==> dev.pcm <==\n", "E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "Dem suffer this brother well well because e no gree leave Jehovah .\n", "Na that time you go see sey Jehovah get you for mind .\n", "E dey always sweet us to go meeting for Kingdom Hall , assembly , or convention .\n", "( Luke 12 : 4 ) Use all your mind believe sey Jehovah go protect you . E go give you ‘ power wey pass normal power , ’ and help you stand well make you no enter Satan trap . ​ — 2 Cor .\n", "Papa and mama , make wuna teach wuna children make they try do pass just to read answer from book , or talk wetin person tell them . If you do this one , you de help them make their light shine .\n", "Felisa : After some time , I come marry and I come dey stay for Cantabaria , one town wey dey Spain .\n" ] } ], "source": [ "# This section does the split between train/dev for the parallel corpora then saves them as separate files\n", "# We use 1000 dev test and the given test set.\n", "import csv\n", "\n", "# Do the split between dev/train and create parallel corpora\n", "num_dev_patterns = 1000\n", "\n", "# Optional: lower case the corpora - this will make it easier to generalize, but without proper casing.\n", "if lc: # Julia: making lowercasing optional\n", " df_pp[\"source_sentence\"] = df_pp[\"source_sentence\"].str.lower()\n", " df_pp[\"target_sentence\"] = df_pp[\"target_sentence\"].str.lower()\n", "\n", "# Julia: test sets are already generated\n", "dev = df_pp.tail(num_dev_patterns) # Herman: Error in original\n", "stripped = df_pp.drop(df_pp.tail(num_dev_patterns).index)\n", "\n", "with open(\"train.\"+source_language, \"w\") as src_file, open(\"train.\"+target_language, \"w\") as trg_file:\n", " for index, row in stripped.iterrows():\n", " src_file.write(row[\"source_sentence\"]+\"\\n\")\n", " trg_file.write(row[\"target_sentence\"]+\"\\n\")\n", " \n", "with open(\"dev.\"+source_language, \"w\") as src_file, open(\"dev.\"+target_language, \"w\") as trg_file:\n", " for index, row in dev.iterrows():\n", " src_file.write(row[\"source_sentence\"]+\"\\n\")\n", " trg_file.write(row[\"target_sentence\"]+\"\\n\")\n", "\n", "#stripped[[\"source_sentence\"]].to_csv(\"train.\"+source_language, header=False, index=False) # Herman: Added `header=False` everywhere\n", "#stripped[[\"target_sentence\"]].to_csv(\"train.\"+target_language, header=False, index=False) # Julia: Problematic handling of quotation marks.\n", "\n", "#dev[[\"source_sentence\"]].to_csv(\"dev.\"+source_language, header=False, index=False)\n", "#dev[[\"target_sentence\"]].to_csv(\"dev.\"+target_language, header=False, index=False)\n", "\n", "# Doublecheck the format below. There should be no extra quotation marks or weird characters.\n", "! head train.*\n", "! head dev.*" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "epeCydmCyS8X" }, "source": [ "\n", "\n", "---\n", "\n", "\n", "## Installation of JoeyNMT\n", "\n", "JoeyNMT is a simple, minimalist NMT package which is useful for learning and teaching. Check out the documentation for JoeyNMT [here](https://joeynmt.readthedocs.io) " ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "colab": {}, "colab_type": "code", "id": "iBRMm4kMxZ8L", "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Cloning into 'joeynmt'...\n", "remote: Enumerating objects: 93, done.\u001b[K\n", "remote: Counting objects: 100% (93/93), done.\u001b[K\n", "remote: Compressing objects: 100% (67/67), done.\u001b[K\n", "remote: Total 2277 (delta 56), reused 47 (delta 26), pack-reused 2184\u001b[K\n", "Receiving objects: 100% (2277/2277), 2.63 MiB | 5.06 MiB/s, done.\n", "Resolving deltas: 100% (1577/1577), done.\n", "Processing /home/ec2-user/SageMaker/masakhane/joeynmt\n", "Requirement already satisfied: future in /usr/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.17.1)\n", "Requirement already satisfied: pillow in /usr/lib64/python3.6/dist-packages (from joeynmt==0.0.1) (6.2.1)\n", "Requirement already satisfied: numpy<2.0,>=1.14.5 in /usr/lib64/python3.6/dist-packages (from joeynmt==0.0.1) (1.17.4)\n", "Requirement already satisfied: setuptools>=41.0.0 in /usr/lib/python3.6/dist-packages (from joeynmt==0.0.1) (41.6.0)\n", "Collecting torch>=1.1 (from joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/24/19/4804aea17cd136f1705a5e98a00618cb8f6ccc375ad8bfa437408e09d058/torch-1.4.0-cp36-cp36m-manylinux1_x86_64.whl (753.4MB)\n", "\u001b[K 100% |████████████████████████████████| 753.4MB 66kB/s eta 0:00:01 26% |████████▌ | 200.0MB 56.3MB/s eta 0:00:10 35% |███████████▎ | 264.9MB 58.5MB/s eta 0:00:09 50% |████████████████▏ | 380.8MB 57.0MB/s eta 0:00:07 66% |█████████████████████▏ | 498.3MB 55.6MB/s eta 0:00:05 72% |███████████████████████▎ | 547.4MB 59.3MB/s eta 0:00:04 90% |█████████████████████████████ | 681.6MB 58.1MB/s eta 0:00:02\n", "\u001b[?25hCollecting tensorflow>=1.14 (from joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/85/d4/c0cd1057b331bc38b65478302114194bd8e1b9c2bbc06e300935c0e93d90/tensorflow-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl (421.8MB)\n", "\u001b[K 100% |████████████████████████████████| 421.8MB 117kB/s eta 0:00:010:09 7% |██▍ | 30.9MB 52.5MB/s eta 0:00:08 72% |███████████████████████▏ | 305.8MB 49.2MB/s eta 0:00:03\n", "\u001b[?25hCollecting torchtext (from joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/79/ef/54b8da26f37787f5c670ae2199329e7dccf195c060b25628d99e587dac51/torchtext-0.5.0-py3-none-any.whl (73kB)\n", "\u001b[K 100% |████████████████████████████████| 81kB 35.2MB/s ta 0:00:01\n", "\u001b[?25hCollecting sacrebleu>=1.3.6 (from joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/45/31/1a135b964c169984b27fb2f7a50280fa7f8e6d9d404d8a9e596180487fd1/sacrebleu-1.4.3-py3-none-any.whl\n", "Collecting subword-nmt (from joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/74/60/6600a7bc09e7ab38bc53a48a20d8cae49b837f93f5842a41fe513a694912/subword_nmt-0.3.7-py2.py3-none-any.whl\n", "Requirement already satisfied: matplotlib in /usr/lib64/python3.6/dist-packages (from joeynmt==0.0.1) (3.1.1)\n", "Collecting seaborn (from joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/70/bd/5e6bf595fe6ee0f257ae49336dd180768c1ed3d7c7155b2fdf894c1c808a/seaborn-0.10.0-py3-none-any.whl (215kB)\n", "\u001b[K 100% |████████████████████████████████| 225kB 39.8MB/s ta 0:00:01\n", "\u001b[?25hCollecting pyyaml>=5.1 (from joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/3d/d9/ea9816aea31beeadccd03f1f8b625ecf8f645bd66744484d162d84803ce5/PyYAML-5.3.tar.gz (268kB)\n", "\u001b[K 100% |████████████████████████████████| 276kB 35.7MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: pylint in /usr/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.9.4)\n", "Collecting six==1.12 (from joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl\n", "Collecting grpcio>=1.8.6 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/45/6d/ee60b05e5e6df8d27561bcb2579b4b2e4004561be7c843768359e005093b/grpcio-1.27.1-cp36-cp36m-manylinux2010_x86_64.whl (2.7MB)\n", "\u001b[K 100% |████████████████████████████████| 2.7MB 13.4MB/s ta 0:00:01\n", "\u001b[?25hCollecting scipy==1.4.1; python_version >= \"3\" (from tensorflow>=1.14->joeynmt==0.0.1)\n", " Using cached https://files.pythonhosted.org/packages/dc/29/162476fd44203116e7980cfbd9352eef9db37c49445d1fec35509022f6aa/scipy-1.4.1-cp36-cp36m-manylinux1_x86_64.whl\n", "Collecting astor>=0.6.0 (from tensorflow>=1.14->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/c3/88/97eef84f48fa04fbd6750e62dcceafba6c63c81b7ac1420856c8dcc0a3f9/astor-0.8.1-py2.py3-none-any.whl\n", "Collecting opt-einsum>=2.3.2 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/b8/83/755bd5324777875e9dff19c2e59daec837d0378c09196634524a3d7269ac/opt_einsum-3.1.0.tar.gz (69kB)\n", "\u001b[K 100% |████████████████████████████████| 71kB 32.6MB/s ta 0:00:01\n", "\u001b[?25hCollecting google-pasta>=0.1.6 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/c3/fd/1e86bc4837cc9a3a5faf3db9b1854aa04ad35b5f381f9648fbe81a6f94e4/google_pasta-0.1.8-py3-none-any.whl (57kB)\n", "\u001b[K 100% |████████████████████████████████| 61kB 29.4MB/s ta 0:00:01\n", "\u001b[?25hCollecting keras-applications>=1.0.8 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/71/e3/19762fdfc62877ae9102edf6342d71b28fbfd9dea3d2f96a882ce099b03f/Keras_Applications-1.0.8-py3-none-any.whl (50kB)\n", "\u001b[K 100% |████████████████████████████████| 51kB 27.5MB/s ta 0:00:01\n", "\u001b[?25hCollecting tensorboard<2.2.0,>=2.1.0 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/40/23/53ffe290341cd0855d595b0a2e7485932f473798af173bbe3a584b99bb06/tensorboard-2.1.0-py3-none-any.whl (3.8MB)\n", "\u001b[K 100% |████████████████████████████████| 3.8MB 10.4MB/s ta 0:00:01\n", "\u001b[?25hCollecting tensorflow-estimator<2.2.0,>=2.1.0rc0 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/18/90/b77c328a1304437ab1310b463e533fa7689f4bfc41549593056d812fab8e/tensorflow_estimator-2.1.0-py2.py3-none-any.whl (448kB)\n", "\u001b[K 100% |████████████████████████████████| 450kB 36.8MB/s ta 0:00:01\n", "\u001b[?25hCollecting keras-preprocessing>=1.1.0 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/28/6a/8c1f62c37212d9fc441a7e26736df51ce6f0e38455816445471f10da4f0a/Keras_Preprocessing-1.1.0-py2.py3-none-any.whl (41kB)\n", "\u001b[K 100% |████████████████████████████████| 51kB 29.4MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: protobuf>=3.8.0 in /usr/lib64/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.10.0)\n", "Collecting gast==0.2.2 (from tensorflow>=1.14->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/4e/35/11749bf99b2d4e3cceb4d55ca22590b0d7c2c62b9de38ac4a4a7f4687421/gast-0.2.2.tar.gz\n", "Collecting absl-py>=0.7.0 (from tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/1a/53/9243c600e047bd4c3df9e69cfabc1e8004a82cac2e0c484580a78a94ba2a/absl-py-0.9.0.tar.gz (104kB)\n", "\u001b[K 100% |████████████████████████████████| 112kB 39.3MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: wheel>=0.26; python_version >= \"3\" in /usr/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.32.3)\n", "Requirement already satisfied: wrapt>=1.11.1 in /usr/lib64/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.11.2)\n", "Collecting termcolor>=1.1.0 (from tensorflow>=1.14->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz\n", "Collecting tqdm (from torchtext->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/cd/80/5bb262050dd2f30f8819626b7c92339708fe2ed7bd5554c8193b4487b367/tqdm-4.42.1-py2.py3-none-any.whl (59kB)\n", "\u001b[K 100% |████████████████████████████████| 61kB 28.1MB/s ta 0:00:01\n", "\u001b[?25hCollecting sentencepiece (from torchtext->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/74/f4/2d5214cbf13d06e7cb2c20d84115ca25b53ea76fa1f0ade0e3c9749de214/sentencepiece-0.1.85-cp36-cp36m-manylinux1_x86_64.whl (1.0MB)\n", "\u001b[K 100% |████████████████████████████████| 1.0MB 26.1MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: requests in /usr/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (2.20.0)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Collecting portalocker (from sacrebleu>=1.3.6->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/91/db/7bc703c0760df726839e0699b7f78a4d8217fdc9c7fcb1b51b39c5a22a4e/portalocker-1.5.2-py2.py3-none-any.whl\n", "Collecting typing (from sacrebleu>=1.3.6->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/fe/2e/b480ee1b75e6d17d2993738670e75c1feeb9ff7f64452153cf018051cc92/typing-3.7.4.1-py3-none-any.whl\n", "Requirement already satisfied: kiwisolver>=1.0.1 in /usr/lib64/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (1.1.0)\n", "Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.4.5)\n", "Requirement already satisfied: python-dateutil>=2.1 in /usr/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.8.1)\n", "Requirement already satisfied: cycler>=0.10 in /usr/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (0.10.0)\n", "Requirement already satisfied: pandas>=0.22.0 in /usr/lib64/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (0.24.2)\n", "Requirement already satisfied: mccabe in /usr/lib/python3.6/dist-packages (from pylint->joeynmt==0.0.1) (0.6.1)\n", "Requirement already satisfied: astroid<2.0,>=1.6 in /usr/lib/python3.6/dist-packages (from pylint->joeynmt==0.0.1) (1.6.6)\n", "Requirement already satisfied: isort>=4.2.5 in /usr/lib/python3.6/dist-packages (from pylint->joeynmt==0.0.1) (4.3.21)\n", "Collecting h5py (from keras-applications>=1.0.8->tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/60/06/cafdd44889200e5438b897388f3075b52a8ef01f28a17366d91de0fa2d05/h5py-2.10.0-cp36-cp36m-manylinux1_x86_64.whl (2.9MB)\n", "\u001b[K 100% |████████████████████████████████| 2.9MB 16.4MB/s ta 0:00:01\n", "\u001b[?25hCollecting werkzeug>=0.11.15 (from tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/ba/a5/d6f8a6e71f15364d35678a4ec8a0186f980b3bd2545f40ad51dd26a87fb1/Werkzeug-1.0.0-py2.py3-none-any.whl (298kB)\n", "\u001b[K 100% |████████████████████████████████| 307kB 39.7MB/s ta 0:00:01\n", "\u001b[?25hCollecting google-auth-oauthlib<0.5,>=0.4.1 (from tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/7b/b8/88def36e74bee9fce511c9519571f4e485e890093ab7442284f4ffaef60b/google_auth_oauthlib-0.4.1-py2.py3-none-any.whl\n", "Collecting google-auth<2,>=1.6.3 (from tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/1c/6d/7aae38a9022f982cf8167775c7fc299f203417b698c27080ce09060bba07/google_auth-1.11.0-py2.py3-none-any.whl (76kB)\n", "\u001b[K 100% |████████████████████████████████| 81kB 36.3MB/s ta 0:00:01\n", "\u001b[?25hCollecting markdown>=2.6.8 (from tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/d5/16/c5a68ef8c62406b3bbd8f49199bbae56feb390746a284c4cf036c687465f/Markdown-3.2-py2.py3-none-any.whl (88kB)\n", "\u001b[K 100% |████████████████████████████████| 92kB 37.0MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: idna<2.8,>=2.5 in /usr/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2.7)\n", "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (3.0.4)\n", "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (1.24.3)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2019.9.11)\n", "Requirement already satisfied: pytz>=2011k in /usr/lib/python3.6/dist-packages (from pandas>=0.22.0->seaborn->joeynmt==0.0.1) (2019.3)\n", "Requirement already satisfied: lazy-object-proxy in /usr/lib64/python3.6/dist-packages (from astroid<2.0,>=1.6->pylint->joeynmt==0.0.1) (1.4.3)\n", "Collecting requests-oauthlib>=0.7.0 (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/a3/12/b92740d845ab62ea4edf04d2f4164d82532b5a0b03836d4d4e71c6f3d379/requests_oauthlib-1.3.0-py2.py3-none-any.whl\n", "Requirement already satisfied: rsa<4.1,>=3.1.4 in /usr/lib/python3.6/dist-packages (from google-auth<2,>=1.6.3->tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1) (3.4.2)\n", "Collecting cachetools<5.0,>=2.0.0 (from google-auth<2,>=1.6.3->tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", " Downloading https://files.pythonhosted.org/packages/08/6a/abf83cb951617793fd49c98cb9456860f5df66ff89883c8660aa0672d425/cachetools-4.0.0-py3-none-any.whl\n", "Collecting pyasn1-modules>=0.2.1 (from google-auth<2,>=1.6.3->tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/95/de/214830a981892a3e286c3794f41ae67a4495df1108c3da8a9f62159b9a9d/pyasn1_modules-0.2.8-py2.py3-none-any.whl (155kB)\n", "\u001b[K 100% |████████████████████████████████| 163kB 40.1MB/s ta 0:00:01\n", "\u001b[?25hCollecting oauthlib>=3.0.0 (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1)\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/05/57/ce2e7a8fa7c0afb54a0581b14a65b56e62b5759dbc98e80627142b8a3704/oauthlib-3.1.0-py2.py3-none-any.whl (147kB)\n", "\u001b[K 100% |████████████████████████████████| 153kB 40.4MB/s ta 0:00:01\n", "\u001b[?25hRequirement already satisfied: pyasn1>=0.1.3 in /usr/lib/python3.6/dist-packages (from rsa<4.1,>=3.1.4->google-auth<2,>=1.6.3->tensorboard<2.2.0,>=2.1.0->tensorflow>=1.14->joeynmt==0.0.1) (0.4.8)\n", "Building wheels for collected packages: joeynmt, pyyaml, opt-einsum, gast, absl-py, termcolor\n", " Building wheel for joeynmt (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /tmp/pip-ephem-wheel-cache-tl77crle/wheels/3a/30/44/d8a7e4bf0ed7c54548bf491738b185ef57222ad3bcdfb1afc4\n", " Building wheel for pyyaml (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /home/ec2-user/.cache/pip/wheels/e4/76/4d/a95b8dd7b452b69e8ed4f68b69e1b55e12c9c9624dd962b191\n", " Building wheel for opt-einsum (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /home/ec2-user/.cache/pip/wheels/2c/b1/94/43d03e130b929aae7ba3f8d15cbd7bc0d1cb5bb38a5c721833\n", " Building wheel for gast (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /home/ec2-user/.cache/pip/wheels/5c/2e/7e/a1d4d4fcebe6c381f378ce7743a3ced3699feb89bcfbdadadd\n", " Building wheel for absl-py (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /home/ec2-user/.cache/pip/wheels/8e/28/49/fad4e7f0b9a1227708cbbee4487ac8558a7334849cb81c813d\n", " Building wheel for termcolor (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Stored in directory: /home/ec2-user/.cache/pip/wheels/7c/06/54/bc84598ba1daf8f970247f550b175aaaee85f68b4b0c5ab2c6\n", "Successfully built joeynmt pyyaml opt-einsum gast absl-py termcolor\n", "\u001b[31mcoremltools 2.0 has requirement six==1.10.0, but you'll have six 1.12.0 which is incompatible.\u001b[0m\n", "\u001b[31mboto3 1.9.72 has requirement botocore<1.13.0,>=1.12.72, but you'll have botocore 1.12.66 which is incompatible.\u001b[0m\n", "\u001b[31mawscli 1.16.76 has requirement PyYAML<=3.13,>=3.10, but you'll have pyyaml 5.3 which is incompatible.\u001b[0m\n", "\u001b[31mtensorboard 2.1.0 has requirement requests<3,>=2.21.0, but you'll have requests 2.20.0 which is incompatible.\u001b[0m\n", "Installing collected packages: torch, six, grpcio, scipy, astor, opt-einsum, google-pasta, h5py, keras-applications, absl-py, werkzeug, oauthlib, requests-oauthlib, cachetools, pyasn1-modules, google-auth, google-auth-oauthlib, markdown, tensorboard, tensorflow-estimator, keras-preprocessing, gast, termcolor, tensorflow, tqdm, sentencepiece, torchtext, portalocker, typing, sacrebleu, subword-nmt, seaborn, pyyaml, joeynmt\n", "\u001b[31mCould not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/lib64/python3.6/dist-packages/torch-1.4.0.dist-info'\n", "Consider using the `--user` option or check the permissions.\n", "\u001b[0m\n", "\u001b[33mYou are using pip version 19.0.2, however version 20.0.2 is available.\n", "You should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\n" ] } ], "source": [ "# Install JoeyNMT\n", "! git clone https://github.com/joeynmt/joeynmt.git\n", "! cd joeynmt; pip3 install ." ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "AaE77Tcppex9" }, "source": [ "# Preprocessing the Data into Subword BPE Tokens\n", "\n", "- One of the most powerful improvements for agglutinative languages (a feature of most Bantu languages) is using BPE tokenization [ (Sennrich, 2015) ](https://arxiv.org/abs/1508.07909).\n", "\n", "- It was also shown that by optimizing the umber of BPE codes we significantly improve results for low-resourced languages [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021) [(Martinus, 2019)](https://arxiv.org/abs/1906.05685)\n", "\n", "- Below we have the scripts for doing BPE tokenization of our data. We use 4000 tokens as recommended by [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021). You do not need to change anything. Simply running the below will be suitable. " ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Collecting subword-nmt\n", " Downloading https://files.pythonhosted.org/packages/74/60/6600a7bc09e7ab38bc53a48a20d8cae49b837f93f5842a41fe513a694912/subword_nmt-0.3.7-py2.py3-none-any.whl\n", "Installing collected packages: subword-nmt\n", "\u001b[33m The script subword-nmt is installed in '/usr/local/bin' which is not on PATH.\n", " Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.\u001b[0m\n", "Successfully installed subword-nmt-0.3.7\n", "\u001b[33mYou are using pip version 19.0.2, however version 20.0.2 is available.\n", "You should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\n" ] } ], "source": [ "!sudo pip3 install subword-nmt" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "colab": {}, "colab_type": "code", "id": "H-TyjtmXB1mL" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "bpe.codes.4000\tdev.en\t test.bpe.pcm test.pcm\t train.en\n", "dev.bpe.en\tdev.pcm test.en\t train.bpe.en train.pcm\n", "dev.bpe.pcm\ttest.bpe.en test.en-any.en train.bpe.pcm\n", "cp: cannot create regular file ‘’: No such file or directory\n", "dev.bpe.en dev.pcm\t test.en\t train.bpe.en train.pcm\n", "dev.bpe.pcm test.bpe.en test.en-any.en train.bpe.pcm\n", "dev.en\t test.bpe.pcm test.pcm\t train.en\n", "BPE Pidgin Sentences\n", "( b ) Wetin you go get for mind sey you no go do ?\n", "book . “ Na wetin I learn for Proverbs 27 : ​ 11 , Matthew 26 : ​ 52 , and John 13 : ​ 35 , give me mind wey mek I no join army .\n", "And even when the wahala come , na wetin dey for this Bible verse dem , still help me fit bear the wahala . ” ​ — An@@ dri@@ y , wey come from U@@ k@@ ra@@ ine .\n", "“ Na wetin dey Isaiah 2 : 4 help me mek I no follow fight war , even when dem want mek I join .\n", "I just dey think for my mind how trouble no go dey the new world .\n", "Combined BPE Vocab\n", "ʺ\n", "satisfac@@\n", "righte@@\n", "ʼ@@\n", "selves\n", "fts\n", "circu@@\n", "gaz@@\n", "integr@@\n", "Ñ@@\n" ] } ], "source": [ "# One of the huge boosts in NMT performance was to use a different method of tokenizing. \n", "# Usually, NMT would tokenize by words. However, using a method called BPE gave amazing boosts to performance\n", "\n", "# Do subword NMT\n", "from os import path\n", "os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n", "os.environ[\"tgt\"] = target_language\n", "\n", "# Learn BPEs on the training data.\n", "os.environ[\"data_path\"] = path.join(\"joeynmt\", \"data\", source_language + target_language) # Herman! \n", "! subword-nmt learn-joint-bpe-and-vocab --input train.$src train.$tgt -s 4000 -o bpe.codes.4000 --write-vocabulary vocab.$src vocab.$tgt\n", "\n", "# Apply BPE splits to the development and test data.\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < train.$src > train.bpe.$src\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < train.$tgt > train.bpe.$tgt\n", "\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < dev.$src > dev.bpe.$src\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < dev.$tgt > dev.bpe.$tgt\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < test.$src > test.bpe.$src\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < test.$tgt > test.bpe.$tgt\n", "\n", "# Create directory, move everyone we care about to the correct location\n", "! mkdir -p $data_path\n", "! cp train.* $data_path\n", "! cp test.* $data_path\n", "! cp dev.* $data_path\n", "! cp bpe.codes.4000 $data_path\n", "! ls $data_path\n", "\n", "# Also move everything we care about to a mounted location in google drive (relevant if running in colab) at gdrive_path\n", "! cp train.* \"$experiment_path\"\n", "! cp test.* \"$experiment_path\"\n", "! cp dev.* \"$experiment_path\"\n", "! cp bpe.codes.4000 \"$gexperiment_path\"\n", "! ls \"$experiment_path\"\n", "\n", "# Create that vocab using build_vocab\n", "! sudo chmod 777 joeynmt/scripts/build_vocab.py\n", "! joeynmt/scripts/build_vocab.py joeynmt/data/$src$tgt/train.bpe.$src joeynmt/data/$src$tgt/train.bpe.$tgt --output_path joeynmt/data/$src$tgt/vocab.txt\n", "\n", "# Some output\n", "! echo \"BPE Pidgin Sentences\"\n", "! tail -n 5 test.bpe.$tgt\n", "! echo \"Combined BPE Vocab\"\n", "! tail -n 10 joeynmt/data/$src$tgt/vocab.txt # Herman" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "Ixmzi60WsUZ8" }, "source": [ "# Creating the JoeyNMT Config\n", "\n", "JoeyNMT requires a yaml config. We provide a template below. We've also set a number of defaults with it, that you may play with!\n", "\n", "- We used Transformer architecture \n", "- We set our dropout to reasonably high: 0.3 (recommended in [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021))\n", "\n", "Things worth playing with:\n", "- The batch size (also recommended to change for low-resourced languages)\n", "- The number of epochs (we've set it at 30 just so it runs in about an hour, for testing purposes)\n", "- The decoder options (beam_size, alpha)\n", "- Evaluation metrics (BLEU versus Crhf4)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": {}, "colab_type": "code", "id": "PIs1lY2hxMsl" }, "outputs": [], "source": [ "# This creates the config file for our JoeyNMT system. It might seem overwhelming so we've provided a couple of useful parameters you'll need to update\n", "# (You can of course play with all the parameters if you'd like!)\n", "\n", "name = '%s%s' % (source_language, target_language)\n", "gdrive_path = os.environ[\"experiment_path\"]\n", "\n", "# Create the config\n", "config = \"\"\"\n", "name: \"{name}_transformer\"\n", "\n", "data:\n", " src: \"{source_language}\"\n", " trg: \"{target_language}\"\n", " train: \"data/{name}/train.bpe\"\n", " dev: \"data/{name}/dev.bpe\"\n", " test: \"data/{name}/test.bpe\"\n", " level: \"bpe\"\n", " lowercase: False\n", " max_sent_length: 100\n", " src_vocab: \"data/{name}/vocab.txt\"\n", " trg_vocab: \"data/{name}/vocab.txt\"\n", "\n", "testing:\n", " beam_size: 5\n", " alpha: 1.0\n", "\n", "training:\n", " #load_model: \"{experiment_path}/models/{name}_transformer/1.ckpt\" # if uncommented, load a pre-trained model from this checkpoint\n", " random_seed: 42\n", " optimizer: \"adam\"\n", " normalization: \"tokens\"\n", " adam_betas: [0.9, 0.999] \n", " scheduling: \"plateau\" # TODO: try switching from plateau to Noam scheduling\n", " patience: 5 # For plateau: decrease learning rate by decrease_factor if validation score has not improved for this many validation rounds.\n", " learning_rate_factor: 0.5 # factor for Noam scheduler (used with Transformer)\n", " learning_rate_warmup: 1000 # warmup steps for Noam scheduler (used with Transformer)\n", " decrease_factor: 0.7\n", " loss: \"crossentropy\"\n", " learning_rate: 0.0003\n", " learning_rate_min: 0.00000001\n", " weight_decay: 0.0\n", " label_smoothing: 0.1\n", " batch_size: 4096\n", " batch_type: \"token\"\n", " eval_batch_size: 3600\n", " eval_batch_type: \"token\"\n", " batch_multiplier: 1\n", " early_stopping_metric: \"ppl\"\n", " epochs: 200 # TODO: Decrease for when playing around and checking of working. Around 30 is sufficient to check if its working at all\n", " validation_freq: 1000 # TODO: Set to at least once per epoch.\n", " logging_freq: 100\n", " eval_metric: \"bleu\"\n", " model_dir: \"models/{name}_transformer\"\n", " overwrite: True # TODO: Set to True if you want to overwrite possibly existing models. \n", " shuffle: True\n", " use_cuda: True\n", " max_output_length: 100\n", " print_valid_sents: [0, 1, 2, 3]\n", " keep_last_ckpts: 5\n", "\n", "model:\n", " initializer: \"xavier\"\n", " bias_initializer: \"zeros\"\n", " init_gain: 1.0\n", " embed_initializer: \"xavier\"\n", " embed_init_gain: 1.0\n", " tied_embeddings: True\n", " tied_softmax: True\n", " encoder:\n", " type: \"transformer\"\n", " num_layers: 6\n", " num_heads: 4 # TODO: Increase to 8 for larger data.\n", " embeddings:\n", " embedding_dim: 256 # TODO: Increase to 512 for larger data.\n", " scale: True\n", " dropout: 0.2\n", " # typically ff_size = 4 x hidden_size\n", " hidden_size: 256 # TODO: Increase to 512 for larger data.\n", " ff_size: 1024 # TODO: Increase to 2048 for larger data.\n", " dropout: 0.3\n", " decoder:\n", " type: \"transformer\"\n", " num_layers: 6\n", " num_heads: 4 # TODO: Increase to 8 for larger data.\n", " embeddings:\n", " embedding_dim: 256 # TODO: Increase to 512 for larger data.\n", " scale: True\n", " dropout: 0.2\n", " # typically ff_size = 4 x hidden_size\n", " hidden_size: 256 # TODO: Increase to 512 for larger data.\n", " ff_size: 1024 # TODO: Increase to 2048 for larger data.\n", " dropout: 0.3\n", "\"\"\".format(name=name, experiment_path=os.environ[\"experiment_path\"], source_language=source_language, target_language=target_language)\n", "with open(\"joeynmt/configs/transformer_{name}.yaml\".format(name=name),'w') as f:\n", " f.write(config)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "pIifxE3Qzuvs" }, "source": [ "# Train the Model\n", "\n", "This single line of joeynmt runs the training using the config we made above" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Extra Installs\n", "Because I am running on AWS, I need to do some extra installs because of environment conflicts. " ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Solving environment: done\n", "\n", "\n", "==> WARNING: A newer version of conda exists. <==\n", " current version: 4.5.12\n", " latest version: 4.8.2\n", "\n", "Please update conda by running\n", "\n", " $ conda update -n base -c defaults conda\n", "\n", "\n", "\n", "## Package Plan ##\n", "\n", " environment location: /home/ec2-user/anaconda3/envs/python3\n", "\n", " added / updated specs: \n", " - cudatoolkit=10.1\n", " - pytorch\n", " - torchvision\n", "\n", "\n", "The following packages will be downloaded:\n", "\n", " package | build\n", " ---------------------------|-----------------\n", " ca-certificates-2020.1.1 | 0 132 KB\n", " pytorch-1.4.0 |py3.6_cuda10.1.243_cudnn7.6.3_0 432.9 MB pytorch\n", " openssl-1.0.2u | h7b6447c_0 3.1 MB\n", " certifi-2019.11.28 | py36_0 156 KB\n", " torchvision-0.5.0 | py36_cu101 9.1 MB pytorch\n", " ------------------------------------------------------------\n", " Total: 445.4 MB\n", "\n", "The following NEW packages will be INSTALLED:\n", "\n", " cudatoolkit: 10.1.243-h6bb024c_0 \n", " ninja: 1.9.0-py36hfd86e86_0 \n", " pytorch: 1.4.0-py3.6_cuda10.1.243_cudnn7.6.3_0 pytorch\n", " torchvision: 0.5.0-py36_cu101 pytorch\n", "\n", "The following packages will be UPDATED:\n", "\n", " ca-certificates: 2019.10.16-0 --> 2020.1.1-0 \n", " certifi: 2019.9.11-py36_0 --> 2019.11.28-py36_0\n", " openssl: 1.0.2t-h7b6447c_1 --> 1.0.2u-h7b6447c_0\n", "\n", "\n", "Downloading and Extracting Packages\n", "ca-certificates-2020 | 132 KB | ##################################### | 100% \n", "pytorch-1.4.0 | 432.9 MB | ##################################### | 100% \n", "openssl-1.0.2u | 3.1 MB | ##################################### | 100% \n", "certifi-2019.11.28 | 156 KB | ##################################### | 100% \n", "torchvision-0.5.0 | 9.1 MB | ##################################### | 100% \n", "Preparing transaction: done\n", "Verifying transaction: done\n", "Executing transaction: done\n" ] } ], "source": [ "!conda install pytorch torchvision cudatoolkit=10.1 -c pytorch --yes" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Solving environment: done\n", "\n", "\n", "==> WARNING: A newer version of conda exists. <==\n", " current version: 4.5.12\n", " latest version: 4.8.2\n", "\n", "Please update conda by running\n", "\n", " $ conda update -n base -c defaults conda\n", "\n", "\n", "\n", "## Package Plan ##\n", "\n", " environment location: /home/ec2-user/anaconda3/envs/python3\n", "\n", " added / updated specs: \n", " - tensorboard\n", "\n", "\n", "The following packages will be downloaded:\n", "\n", " package | build\n", " ---------------------------|-----------------\n", " c-ares-1.15.0 | h7b6447c_1001 102 KB\n", " absl-py-0.8.1 | py36_0 161 KB\n", " markdown-3.1.1 | py36_0 113 KB\n", " numpy-1.18.1 | py36h4f9e942_0 5 KB\n", " scikit-learn-0.22.1 | py36hd81dba3_0 7.1 MB\n", " tensorboard-2.0.0 | pyhb38c66f_1 3.3 MB\n", " mkl-2020.0 | 166 202.1 MB\n", " scipy-1.4.1 | py36h0b6359f_0 18.9 MB\n", " numpy-base-1.18.1 | py36hde5b4d6_1 5.2 MB\n", " six-1.14.0 | py36_0 27 KB\n", " setuptools-45.1.0 | py36_0 648 KB\n", " grpcio-1.14.1 | py36h9ba97e2_0 1.0 MB\n", " numexpr-2.7.1 | py36h423224d_0 197 KB\n", " joblib-0.14.1 | py_0 202 KB\n", " ------------------------------------------------------------\n", " Total: 239.1 MB\n", "\n", "The following NEW packages will be INSTALLED:\n", "\n", " absl-py: 0.8.1-py36_0 \n", " c-ares: 1.15.0-h7b6447c_1001 \n", " grpcio: 1.14.1-py36h9ba97e2_0\n", " joblib: 0.14.1-py_0 \n", " markdown: 3.1.1-py36_0 \n", " tensorboard: 2.0.0-pyhb38c66f_1 \n", "\n", "The following packages will be UPDATED:\n", "\n", " mkl: 2018.0.2-1 --> 2020.0-166 \n", " mkl-service: 1.1.2-py36h17a0993_4 --> 2.3.0-py36he904b0f_0 \n", " numexpr: 2.6.5-py36h7bf3b9c_0 --> 2.7.1-py36h423224d_0 \n", " numpy: 1.14.3-py36hcd700cb_1 --> 1.18.1-py36h4f9e942_0\n", " numpy-base: 1.14.3-py36h9be14a7_1 --> 1.18.1-py36hde5b4d6_1\n", " scikit-learn: 0.19.1-py36h7aa7ec6_0 --> 0.22.1-py36hd81dba3_0\n", " scipy: 1.1.0-py36hfc37229_0 --> 1.4.1-py36h0b6359f_0 \n", " setuptools: 39.1.0-py36_0 --> 45.1.0-py36_0 \n", " six: 1.11.0-py36h372c433_1 --> 1.14.0-py36_0 \n", "\n", "\n", "Downloading and Extracting Packages\n", "c-ares-1.15.0 | 102 KB | ##################################### | 100% \n", "absl-py-0.8.1 | 161 KB | ##################################### | 100% \n", "markdown-3.1.1 | 113 KB | ##################################### | 100% \n", "numpy-1.18.1 | 5 KB | ##################################### | 100% \n", "scikit-learn-0.22.1 | 7.1 MB | ##################################### | 100% \n", "tensorboard-2.0.0 | 3.3 MB | ##################################### | 100% \n", "mkl-2020.0 | 202.1 MB | ##################################### | 100% \n", "scipy-1.4.1 | 18.9 MB | ##################################### | 100% \n", "numpy-base-1.18.1 | 5.2 MB | ##################################### | 100% \n", "six-1.14.0 | 27 KB | ##################################### | 100% \n", "setuptools-45.1.0 | 648 KB | ##################################### | 100% \n", "grpcio-1.14.1 | 1.0 MB | ##################################### | 100% \n", "numexpr-2.7.1 | 197 KB | ##################################### | 100% \n", "joblib-0.14.1 | 202 KB | ##################################### | 100% \n", "Preparing transaction: done\n", "Verifying transaction: done\n", "Executing transaction: done\n" ] } ], "source": [ "!conda install tensorboard --yes" ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Solving environment: done\n", "\n", "\n", "==> WARNING: A newer version of conda exists. <==\n", " current version: 4.5.12\n", " latest version: 4.8.2\n", "\n", "Please update conda by running\n", "\n", " $ conda update -n base -c defaults conda\n", "\n", "\n", "\n", "## Package Plan ##\n", "\n", " environment location: /home/ec2-user/anaconda3/envs/python3\n", "\n", " added / updated specs: \n", " - torchtext\n", "\n", "\n", "The following packages will be downloaded:\n", "\n", " package | build\n", " ---------------------------|-----------------\n", " tqdm-4.42.0 | py_0 56 KB\n", " torchtext-0.5.0 | py_1 1.4 MB pytorch\n", " ------------------------------------------------------------\n", " Total: 1.5 MB\n", "\n", "The following NEW packages will be INSTALLED:\n", "\n", " torchtext: 0.5.0-py_1 pytorch\n", " tqdm: 4.42.0-py_0 \n", "\n", "\n", "Downloading and Extracting Packages\n", "tqdm-4.42.0 | 56 KB | ##################################### | 100% \n", "torchtext-0.5.0 | 1.4 MB | ##################################### | 100% \n", "Preparing transaction: done\n", "Verifying transaction: done\n", "Executing transaction: done\n" ] } ], "source": [ "!conda install -c pytorch torchtext --yes" ] }, { "cell_type": "code", "execution_count": 19, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Solving environment: done\n", "\n", "\n", "==> WARNING: A newer version of conda exists. <==\n", " current version: 4.5.12\n", " latest version: 4.8.2\n", "\n", "Please update conda by running\n", "\n", " $ conda update -n base -c defaults conda\n", "\n", "\n", "\n", "## Package Plan ##\n", "\n", " environment location: /home/ec2-user/anaconda3/envs/python3\n", "\n", " added / updated specs: \n", " - sentencepiece\n", "\n", "\n", "The following packages will be downloaded:\n", "\n", " package | build\n", " ---------------------------|-----------------\n", " astor-0.8.0 | py36_0 45 KB\n", " tensorflow-2.0.0 |mkl_py36hef7ec59_0 3 KB\n", " sentencepiece-0.1.84 | py36h6bb024c_0 3.1 MB powerai\n", " keras-applications-1.0.8 | py_0 33 KB\n", " tensorflow-base-2.0.0 |mkl_py36h9204916_0 100.9 MB\n", " gast-0.2.2 | py36_0 138 KB\n", " google-pasta-0.1.8 | py_0 43 KB\n", " opt_einsum-3.1.0 | py_0 54 KB\n", " _tflow_select-2.3.0 | mkl 2 KB\n", " keras-preprocessing-1.1.0 | py_1 36 KB\n", " termcolor-1.1.0 | py36_1 7 KB\n", " tensorflow-estimator-2.0.0 | pyh2649769_0 272 KB\n", " ------------------------------------------------------------\n", " Total: 104.6 MB\n", "\n", "The following NEW packages will be INSTALLED:\n", "\n", " _tflow_select: 2.3.0-mkl \n", " astor: 0.8.0-py36_0 \n", " gast: 0.2.2-py36_0 \n", " google-pasta: 0.1.8-py_0 \n", " keras-applications: 1.0.8-py_0 \n", " keras-preprocessing: 1.1.0-py_1 \n", " opt_einsum: 3.1.0-py_0 \n", " sentencepiece: 0.1.84-py36h6bb024c_0 powerai\n", " tensorflow: 2.0.0-mkl_py36hef7ec59_0 \n", " tensorflow-base: 2.0.0-mkl_py36h9204916_0 \n", " tensorflow-estimator: 2.0.0-pyh2649769_0 \n", " termcolor: 1.1.0-py36_1 \n", "\n", "The following packages will be UPDATED:\n", "\n", " wrapt: 1.10.11-py36h28b7045_0 --> 1.11.2-py36h7b6447c_0\n", "\n", "\n", "Downloading and Extracting Packages\n", "astor-0.8.0 | 45 KB | ##################################### | 100% \n", "tensorflow-2.0.0 | 3 KB | ##################################### | 100% \n", "sentencepiece-0.1.84 | 3.1 MB | ##################################### | 100% \n", "keras-applications-1 | 33 KB | ##################################### | 100% \n", "tensorflow-base-2.0. | 100.9 MB | ##################################### | 100% \n", "gast-0.2.2 | 138 KB | ##################################### | 100% \n", "google-pasta-0.1.8 | 43 KB | ##################################### | 100% \n", "opt_einsum-3.1.0 | 54 KB | ##################################### | 100% \n", "_tflow_select-2.3.0 | 2 KB | ##################################### | 100% \n", "keras-preprocessing- | 36 KB | ##################################### | 100% \n", "termcolor-1.1.0 | 7 KB | ##################################### | 100% \n", "tensorflow-estimator | 272 KB | ##################################### | 100% \n", "Preparing transaction: done\n", "Verifying transaction: done\n", "Executing transaction: done\n" ] } ], "source": [ "!conda install -c powerai sentencepiece --yes" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Solving environment: done\n", "\n", "\n", "==> WARNING: A newer version of conda exists. <==\n", " current version: 4.5.12\n", " latest version: 4.8.2\n", "\n", "Please update conda by running\n", "\n", " $ conda update -n base -c defaults conda\n", "\n", "\n", "\n", "## Package Plan ##\n", "\n", " environment location: /home/ec2-user/anaconda3/envs/python3\n", "\n", " added / updated specs: \n", " - sacrebleu\n", "\n", "\n", "The following packages will be downloaded:\n", "\n", " package | build\n", " ---------------------------|-----------------\n", " sacrebleu-1.4.3 | py_0 34 KB powerai\n", " portalocker-1.5.2 | py36_0 20 KB\n", " ------------------------------------------------------------\n", " Total: 54 KB\n", "\n", "The following NEW packages will be INSTALLED:\n", "\n", " portalocker: 1.5.2-py36_0 \n", " sacrebleu: 1.4.3-py_0 powerai\n", "\n", "\n", "Downloading and Extracting Packages\n", "sacrebleu-1.4.3 | 34 KB | ##################################### | 100% \n", "portalocker-1.5.2 | 20 KB | ##################################### | 100% \n", "Preparing transaction: done\n", "Verifying transaction: done\n", "Executing transaction: done\n" ] } ], "source": [ "!conda install -c powerai sacrebleu --yes" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Train the model" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": {}, "colab_type": "code", "id": "6ZBPFwT94WpI" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:08:37,412 Hello! This is Joey-NMT.\n", "2020-02-09 13:08:40,281 Total params: 12099840\n", "2020-02-09 13:08:40,283 Trainable parameters: ['decoder.layer_norm.bias', 'decoder.layer_norm.weight', 'decoder.layers.0.dec_layer_norm.bias', 'decoder.layers.0.dec_layer_norm.weight', 'decoder.layers.0.feed_forward.layer_norm.bias', 'decoder.layers.0.feed_forward.layer_norm.weight', 'decoder.layers.0.feed_forward.pwff_layer.0.bias', 'decoder.layers.0.feed_forward.pwff_layer.0.weight', 'decoder.layers.0.feed_forward.pwff_layer.3.bias', 'decoder.layers.0.feed_forward.pwff_layer.3.weight', 'decoder.layers.0.src_trg_att.k_layer.bias', 'decoder.layers.0.src_trg_att.k_layer.weight', 'decoder.layers.0.src_trg_att.output_layer.bias', 'decoder.layers.0.src_trg_att.output_layer.weight', 'decoder.layers.0.src_trg_att.q_layer.bias', 'decoder.layers.0.src_trg_att.q_layer.weight', 'decoder.layers.0.src_trg_att.v_layer.bias', 'decoder.layers.0.src_trg_att.v_layer.weight', 'decoder.layers.0.trg_trg_att.k_layer.bias', 'decoder.layers.0.trg_trg_att.k_layer.weight', 'decoder.layers.0.trg_trg_att.output_layer.bias', 'decoder.layers.0.trg_trg_att.output_layer.weight', 'decoder.layers.0.trg_trg_att.q_layer.bias', 'decoder.layers.0.trg_trg_att.q_layer.weight', 'decoder.layers.0.trg_trg_att.v_layer.bias', 'decoder.layers.0.trg_trg_att.v_layer.weight', 'decoder.layers.0.x_layer_norm.bias', 'decoder.layers.0.x_layer_norm.weight', 'decoder.layers.1.dec_layer_norm.bias', 'decoder.layers.1.dec_layer_norm.weight', 'decoder.layers.1.feed_forward.layer_norm.bias', 'decoder.layers.1.feed_forward.layer_norm.weight', 'decoder.layers.1.feed_forward.pwff_layer.0.bias', 'decoder.layers.1.feed_forward.pwff_layer.0.weight', 'decoder.layers.1.feed_forward.pwff_layer.3.bias', 'decoder.layers.1.feed_forward.pwff_layer.3.weight', 'decoder.layers.1.src_trg_att.k_layer.bias', 'decoder.layers.1.src_trg_att.k_layer.weight', 'decoder.layers.1.src_trg_att.output_layer.bias', 'decoder.layers.1.src_trg_att.output_layer.weight', 'decoder.layers.1.src_trg_att.q_layer.bias', 'decoder.layers.1.src_trg_att.q_layer.weight', 'decoder.layers.1.src_trg_att.v_layer.bias', 'decoder.layers.1.src_trg_att.v_layer.weight', 'decoder.layers.1.trg_trg_att.k_layer.bias', 'decoder.layers.1.trg_trg_att.k_layer.weight', 'decoder.layers.1.trg_trg_att.output_layer.bias', 'decoder.layers.1.trg_trg_att.output_layer.weight', 'decoder.layers.1.trg_trg_att.q_layer.bias', 'decoder.layers.1.trg_trg_att.q_layer.weight', 'decoder.layers.1.trg_trg_att.v_layer.bias', 'decoder.layers.1.trg_trg_att.v_layer.weight', 'decoder.layers.1.x_layer_norm.bias', 'decoder.layers.1.x_layer_norm.weight', 'decoder.layers.2.dec_layer_norm.bias', 'decoder.layers.2.dec_layer_norm.weight', 'decoder.layers.2.feed_forward.layer_norm.bias', 'decoder.layers.2.feed_forward.layer_norm.weight', 'decoder.layers.2.feed_forward.pwff_layer.0.bias', 'decoder.layers.2.feed_forward.pwff_layer.0.weight', 'decoder.layers.2.feed_forward.pwff_layer.3.bias', 'decoder.layers.2.feed_forward.pwff_layer.3.weight', 'decoder.layers.2.src_trg_att.k_layer.bias', 'decoder.layers.2.src_trg_att.k_layer.weight', 'decoder.layers.2.src_trg_att.output_layer.bias', 'decoder.layers.2.src_trg_att.output_layer.weight', 'decoder.layers.2.src_trg_att.q_layer.bias', 'decoder.layers.2.src_trg_att.q_layer.weight', 'decoder.layers.2.src_trg_att.v_layer.bias', 'decoder.layers.2.src_trg_att.v_layer.weight', 'decoder.layers.2.trg_trg_att.k_layer.bias', 'decoder.layers.2.trg_trg_att.k_layer.weight', 'decoder.layers.2.trg_trg_att.output_layer.bias', 'decoder.layers.2.trg_trg_att.output_layer.weight', 'decoder.layers.2.trg_trg_att.q_layer.bias', 'decoder.layers.2.trg_trg_att.q_layer.weight', 'decoder.layers.2.trg_trg_att.v_layer.bias', 'decoder.layers.2.trg_trg_att.v_layer.weight', 'decoder.layers.2.x_layer_norm.bias', 'decoder.layers.2.x_layer_norm.weight', 'decoder.layers.3.dec_layer_norm.bias', 'decoder.layers.3.dec_layer_norm.weight', 'decoder.layers.3.feed_forward.layer_norm.bias', 'decoder.layers.3.feed_forward.layer_norm.weight', 'decoder.layers.3.feed_forward.pwff_layer.0.bias', 'decoder.layers.3.feed_forward.pwff_layer.0.weight', 'decoder.layers.3.feed_forward.pwff_layer.3.bias', 'decoder.layers.3.feed_forward.pwff_layer.3.weight', 'decoder.layers.3.src_trg_att.k_layer.bias', 'decoder.layers.3.src_trg_att.k_layer.weight', 'decoder.layers.3.src_trg_att.output_layer.bias', 'decoder.layers.3.src_trg_att.output_layer.weight', 'decoder.layers.3.src_trg_att.q_layer.bias', 'decoder.layers.3.src_trg_att.q_layer.weight', 'decoder.layers.3.src_trg_att.v_layer.bias', 'decoder.layers.3.src_trg_att.v_layer.weight', 'decoder.layers.3.trg_trg_att.k_layer.bias', 'decoder.layers.3.trg_trg_att.k_layer.weight', 'decoder.layers.3.trg_trg_att.output_layer.bias', 'decoder.layers.3.trg_trg_att.output_layer.weight', 'decoder.layers.3.trg_trg_att.q_layer.bias', 'decoder.layers.3.trg_trg_att.q_layer.weight', 'decoder.layers.3.trg_trg_att.v_layer.bias', 'decoder.layers.3.trg_trg_att.v_layer.weight', 'decoder.layers.3.x_layer_norm.bias', 'decoder.layers.3.x_layer_norm.weight', 'decoder.layers.4.dec_layer_norm.bias', 'decoder.layers.4.dec_layer_norm.weight', 'decoder.layers.4.feed_forward.layer_norm.bias', 'decoder.layers.4.feed_forward.layer_norm.weight', 'decoder.layers.4.feed_forward.pwff_layer.0.bias', 'decoder.layers.4.feed_forward.pwff_layer.0.weight', 'decoder.layers.4.feed_forward.pwff_layer.3.bias', 'decoder.layers.4.feed_forward.pwff_layer.3.weight', 'decoder.layers.4.src_trg_att.k_layer.bias', 'decoder.layers.4.src_trg_att.k_layer.weight', 'decoder.layers.4.src_trg_att.output_layer.bias', 'decoder.layers.4.src_trg_att.output_layer.weight', 'decoder.layers.4.src_trg_att.q_layer.bias', 'decoder.layers.4.src_trg_att.q_layer.weight', 'decoder.layers.4.src_trg_att.v_layer.bias', 'decoder.layers.4.src_trg_att.v_layer.weight', 'decoder.layers.4.trg_trg_att.k_layer.bias', 'decoder.layers.4.trg_trg_att.k_layer.weight', 'decoder.layers.4.trg_trg_att.output_layer.bias', 'decoder.layers.4.trg_trg_att.output_layer.weight', 'decoder.layers.4.trg_trg_att.q_layer.bias', 'decoder.layers.4.trg_trg_att.q_layer.weight', 'decoder.layers.4.trg_trg_att.v_layer.bias', 'decoder.layers.4.trg_trg_att.v_layer.weight', 'decoder.layers.4.x_layer_norm.bias', 'decoder.layers.4.x_layer_norm.weight', 'decoder.layers.5.dec_layer_norm.bias', 'decoder.layers.5.dec_layer_norm.weight', 'decoder.layers.5.feed_forward.layer_norm.bias', 'decoder.layers.5.feed_forward.layer_norm.weight', 'decoder.layers.5.feed_forward.pwff_layer.0.bias', 'decoder.layers.5.feed_forward.pwff_layer.0.weight', 'decoder.layers.5.feed_forward.pwff_layer.3.bias', 'decoder.layers.5.feed_forward.pwff_layer.3.weight', 'decoder.layers.5.src_trg_att.k_layer.bias', 'decoder.layers.5.src_trg_att.k_layer.weight', 'decoder.layers.5.src_trg_att.output_layer.bias', 'decoder.layers.5.src_trg_att.output_layer.weight', 'decoder.layers.5.src_trg_att.q_layer.bias', 'decoder.layers.5.src_trg_att.q_layer.weight', 'decoder.layers.5.src_trg_att.v_layer.bias', 'decoder.layers.5.src_trg_att.v_layer.weight', 'decoder.layers.5.trg_trg_att.k_layer.bias', 'decoder.layers.5.trg_trg_att.k_layer.weight', 'decoder.layers.5.trg_trg_att.output_layer.bias', 'decoder.layers.5.trg_trg_att.output_layer.weight', 'decoder.layers.5.trg_trg_att.q_layer.bias', 'decoder.layers.5.trg_trg_att.q_layer.weight', 'decoder.layers.5.trg_trg_att.v_layer.bias', 'decoder.layers.5.trg_trg_att.v_layer.weight', 'decoder.layers.5.x_layer_norm.bias', 'decoder.layers.5.x_layer_norm.weight', 'encoder.layer_norm.bias', 'encoder.layer_norm.weight', 'encoder.layers.0.feed_forward.layer_norm.bias', 'encoder.layers.0.feed_forward.layer_norm.weight', 'encoder.layers.0.feed_forward.pwff_layer.0.bias', 'encoder.layers.0.feed_forward.pwff_layer.0.weight', 'encoder.layers.0.feed_forward.pwff_layer.3.bias', 'encoder.layers.0.feed_forward.pwff_layer.3.weight', 'encoder.layers.0.layer_norm.bias', 'encoder.layers.0.layer_norm.weight', 'encoder.layers.0.src_src_att.k_layer.bias', 'encoder.layers.0.src_src_att.k_layer.weight', 'encoder.layers.0.src_src_att.output_layer.bias', 'encoder.layers.0.src_src_att.output_layer.weight', 'encoder.layers.0.src_src_att.q_layer.bias', 'encoder.layers.0.src_src_att.q_layer.weight', 'encoder.layers.0.src_src_att.v_layer.bias', 'encoder.layers.0.src_src_att.v_layer.weight', 'encoder.layers.1.feed_forward.layer_norm.bias', 'encoder.layers.1.feed_forward.layer_norm.weight', 'encoder.layers.1.feed_forward.pwff_layer.0.bias', 'encoder.layers.1.feed_forward.pwff_layer.0.weight', 'encoder.layers.1.feed_forward.pwff_layer.3.bias', 'encoder.layers.1.feed_forward.pwff_layer.3.weight', 'encoder.layers.1.layer_norm.bias', 'encoder.layers.1.layer_norm.weight', 'encoder.layers.1.src_src_att.k_layer.bias', 'encoder.layers.1.src_src_att.k_layer.weight', 'encoder.layers.1.src_src_att.output_layer.bias', 'encoder.layers.1.src_src_att.output_layer.weight', 'encoder.layers.1.src_src_att.q_layer.bias', 'encoder.layers.1.src_src_att.q_layer.weight', 'encoder.layers.1.src_src_att.v_layer.bias', 'encoder.layers.1.src_src_att.v_layer.weight', 'encoder.layers.2.feed_forward.layer_norm.bias', 'encoder.layers.2.feed_forward.layer_norm.weight', 'encoder.layers.2.feed_forward.pwff_layer.0.bias', 'encoder.layers.2.feed_forward.pwff_layer.0.weight', 'encoder.layers.2.feed_forward.pwff_layer.3.bias', 'encoder.layers.2.feed_forward.pwff_layer.3.weight', 'encoder.layers.2.layer_norm.bias', 'encoder.layers.2.layer_norm.weight', 'encoder.layers.2.src_src_att.k_layer.bias', 'encoder.layers.2.src_src_att.k_layer.weight', 'encoder.layers.2.src_src_att.output_layer.bias', 'encoder.layers.2.src_src_att.output_layer.weight', 'encoder.layers.2.src_src_att.q_layer.bias', 'encoder.layers.2.src_src_att.q_layer.weight', 'encoder.layers.2.src_src_att.v_layer.bias', 'encoder.layers.2.src_src_att.v_layer.weight', 'encoder.layers.3.feed_forward.layer_norm.bias', 'encoder.layers.3.feed_forward.layer_norm.weight', 'encoder.layers.3.feed_forward.pwff_layer.0.bias', 'encoder.layers.3.feed_forward.pwff_layer.0.weight', 'encoder.layers.3.feed_forward.pwff_layer.3.bias', 'encoder.layers.3.feed_forward.pwff_layer.3.weight', 'encoder.layers.3.layer_norm.bias', 'encoder.layers.3.layer_norm.weight', 'encoder.layers.3.src_src_att.k_layer.bias', 'encoder.layers.3.src_src_att.k_layer.weight', 'encoder.layers.3.src_src_att.output_layer.bias', 'encoder.layers.3.src_src_att.output_layer.weight', 'encoder.layers.3.src_src_att.q_layer.bias', 'encoder.layers.3.src_src_att.q_layer.weight', 'encoder.layers.3.src_src_att.v_layer.bias', 'encoder.layers.3.src_src_att.v_layer.weight', 'encoder.layers.4.feed_forward.layer_norm.bias', 'encoder.layers.4.feed_forward.layer_norm.weight', 'encoder.layers.4.feed_forward.pwff_layer.0.bias', 'encoder.layers.4.feed_forward.pwff_layer.0.weight', 'encoder.layers.4.feed_forward.pwff_layer.3.bias', 'encoder.layers.4.feed_forward.pwff_layer.3.weight', 'encoder.layers.4.layer_norm.bias', 'encoder.layers.4.layer_norm.weight', 'encoder.layers.4.src_src_att.k_layer.bias', 'encoder.layers.4.src_src_att.k_layer.weight', 'encoder.layers.4.src_src_att.output_layer.bias', 'encoder.layers.4.src_src_att.output_layer.weight', 'encoder.layers.4.src_src_att.q_layer.bias', 'encoder.layers.4.src_src_att.q_layer.weight', 'encoder.layers.4.src_src_att.v_layer.bias', 'encoder.layers.4.src_src_att.v_layer.weight', 'encoder.layers.5.feed_forward.layer_norm.bias', 'encoder.layers.5.feed_forward.layer_norm.weight', 'encoder.layers.5.feed_forward.pwff_layer.0.bias', 'encoder.layers.5.feed_forward.pwff_layer.0.weight', 'encoder.layers.5.feed_forward.pwff_layer.3.bias', 'encoder.layers.5.feed_forward.pwff_layer.3.weight', 'encoder.layers.5.layer_norm.bias', 'encoder.layers.5.layer_norm.weight', 'encoder.layers.5.src_src_att.k_layer.bias', 'encoder.layers.5.src_src_att.k_layer.weight', 'encoder.layers.5.src_src_att.output_layer.bias', 'encoder.layers.5.src_src_att.output_layer.weight', 'encoder.layers.5.src_src_att.q_layer.bias', 'encoder.layers.5.src_src_att.q_layer.weight', 'encoder.layers.5.src_src_att.v_layer.bias', 'encoder.layers.5.src_src_att.v_layer.weight', 'src_embed.lut.weight']\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:08:44,832 cfg.name : enpcm_transformer\n", "2020-02-09 13:08:44,832 cfg.data.src : en\n", "2020-02-09 13:08:44,832 cfg.data.trg : pcm\n", "2020-02-09 13:08:44,832 cfg.data.train : data/enpcm/train.bpe\n", "2020-02-09 13:08:44,832 cfg.data.dev : data/enpcm/dev.bpe\n", "2020-02-09 13:08:44,832 cfg.data.test : data/enpcm/test.bpe\n", "2020-02-09 13:08:44,832 cfg.data.level : bpe\n", "2020-02-09 13:08:44,832 cfg.data.lowercase : False\n", "2020-02-09 13:08:44,832 cfg.data.max_sent_length : 100\n", "2020-02-09 13:08:44,832 cfg.data.src_vocab : data/enpcm/vocab.txt\n", "2020-02-09 13:08:44,833 cfg.data.trg_vocab : data/enpcm/vocab.txt\n", "2020-02-09 13:08:44,833 cfg.testing.beam_size : 5\n", "2020-02-09 13:08:44,833 cfg.testing.alpha : 1.0\n", "2020-02-09 13:08:44,833 cfg.training.random_seed : 42\n", "2020-02-09 13:08:44,833 cfg.training.optimizer : adam\n", "2020-02-09 13:08:44,833 cfg.training.normalization : tokens\n", "2020-02-09 13:08:44,833 cfg.training.adam_betas : [0.9, 0.999]\n", "2020-02-09 13:08:44,833 cfg.training.scheduling : plateau\n", "2020-02-09 13:08:44,833 cfg.training.patience : 5\n", "2020-02-09 13:08:44,833 cfg.training.learning_rate_factor : 0.5\n", "2020-02-09 13:08:44,833 cfg.training.learning_rate_warmup : 1000\n", "2020-02-09 13:08:44,833 cfg.training.decrease_factor : 0.7\n", "2020-02-09 13:08:44,833 cfg.training.loss : crossentropy\n", "2020-02-09 13:08:44,833 cfg.training.learning_rate : 0.0003\n", "2020-02-09 13:08:44,833 cfg.training.learning_rate_min : 1e-08\n", "2020-02-09 13:08:44,833 cfg.training.weight_decay : 0.0\n", "2020-02-09 13:08:44,834 cfg.training.label_smoothing : 0.1\n", "2020-02-09 13:08:44,834 cfg.training.batch_size : 4096\n", "2020-02-09 13:08:44,834 cfg.training.batch_type : token\n", "2020-02-09 13:08:44,834 cfg.training.eval_batch_size : 3600\n", "2020-02-09 13:08:44,834 cfg.training.eval_batch_type : token\n", "2020-02-09 13:08:44,834 cfg.training.batch_multiplier : 1\n", "2020-02-09 13:08:44,834 cfg.training.early_stopping_metric : ppl\n", "2020-02-09 13:08:44,834 cfg.training.epochs : 200\n", "2020-02-09 13:08:44,834 cfg.training.validation_freq : 1000\n", "2020-02-09 13:08:44,834 cfg.training.logging_freq : 100\n", "2020-02-09 13:08:44,834 cfg.training.eval_metric : bleu\n", "2020-02-09 13:08:44,834 cfg.training.model_dir : models/enpcm_transformer\n", "2020-02-09 13:08:44,834 cfg.training.overwrite : True\n", "2020-02-09 13:08:44,834 cfg.training.shuffle : True\n", "2020-02-09 13:08:44,834 cfg.training.use_cuda : True\n", "2020-02-09 13:08:44,834 cfg.training.max_output_length : 100\n", "2020-02-09 13:08:44,834 cfg.training.print_valid_sents : [0, 1, 2, 3]\n", "2020-02-09 13:08:44,835 cfg.training.keep_last_ckpts : 5\n", "2020-02-09 13:08:44,835 cfg.model.initializer : xavier\n", "2020-02-09 13:08:44,835 cfg.model.bias_initializer : zeros\n", "2020-02-09 13:08:44,835 cfg.model.init_gain : 1.0\n", "2020-02-09 13:08:44,835 cfg.model.embed_initializer : xavier\n", "2020-02-09 13:08:44,835 cfg.model.embed_init_gain : 1.0\n", "2020-02-09 13:08:44,835 cfg.model.tied_embeddings : True\n", "2020-02-09 13:08:44,835 cfg.model.tied_softmax : True\n", "2020-02-09 13:08:44,835 cfg.model.encoder.type : transformer\n", "2020-02-09 13:08:44,835 cfg.model.encoder.num_layers : 6\n", "2020-02-09 13:08:44,835 cfg.model.encoder.num_heads : 4\n", "2020-02-09 13:08:44,835 cfg.model.encoder.embeddings.embedding_dim : 256\n", "2020-02-09 13:08:44,835 cfg.model.encoder.embeddings.scale : True\n", "2020-02-09 13:08:44,835 cfg.model.encoder.embeddings.dropout : 0.2\n", "2020-02-09 13:08:44,835 cfg.model.encoder.hidden_size : 256\n", "2020-02-09 13:08:44,835 cfg.model.encoder.ff_size : 1024\n", "2020-02-09 13:08:44,836 cfg.model.encoder.dropout : 0.3\n", "2020-02-09 13:08:44,836 cfg.model.decoder.type : transformer\n", "2020-02-09 13:08:44,836 cfg.model.decoder.num_layers : 6\n", "2020-02-09 13:08:44,836 cfg.model.decoder.num_heads : 4\n", "2020-02-09 13:08:44,836 cfg.model.decoder.embeddings.embedding_dim : 256\n", "2020-02-09 13:08:44,836 cfg.model.decoder.embeddings.scale : True\n", "2020-02-09 13:08:44,836 cfg.model.decoder.embeddings.dropout : 0.2\n", "2020-02-09 13:08:44,836 cfg.model.decoder.hidden_size : 256\n", "2020-02-09 13:08:44,836 cfg.model.decoder.ff_size : 1024\n", "2020-02-09 13:08:44,836 cfg.model.decoder.dropout : 0.3\n", "2020-02-09 13:08:44,836 Data set sizes: \n", "\ttrain 20214,\n", "\tvalid 1000,\n", "\ttest 2101\n", "2020-02-09 13:08:44,836 First training example:\n", "\t[SRC] JE@@ H@@ O@@ V@@ A@@ H@@ ’@@ S servants high@@ ly est@@ e@@ em God’s own holy book , the Bible .\n", "\t[TRG] JE@@ H@@ O@@ V@@ A@@ H people value Bible well well because dem know sey na God holy book .\n", "2020-02-09 13:08:44,836 First 10 words (src): (0) (1) (2) (3) (4) . (5) , (6) the (7) to (8) and (9) dey\n", "2020-02-09 13:08:44,837 First 10 words (trg): (0) (1) (2) (3) (4) . (5) , (6) the (7) to (8) and (9) dey\n", "2020-02-09 13:08:44,837 Number of Src words (types): 4061\n", "2020-02-09 13:08:44,838 Number of Trg words (types): 4061\n", "2020-02-09 13:08:44,838 Model(\n", "\tencoder=TransformerEncoder(num_layers=6, num_heads=4),\n", "\tdecoder=TransformerDecoder(num_layers=6, num_heads=4),\n", "\tsrc_embed=Embeddings(embedding_dim=256, vocab_size=4061),\n", "\ttrg_embed=Embeddings(embedding_dim=256, vocab_size=4061))\n", "2020-02-09 13:08:44,842 EPOCH 1\n", "2020-02-09 13:09:01,154 Epoch 1 Step: 100 Batch Loss: 4.827715 Tokens per Sec: 13096, Lr: 0.000300\n", "2020-02-09 13:09:10,991 Epoch 1 Step: 200 Batch Loss: 4.614224 Tokens per Sec: 20943, Lr: 0.000300\n", "2020-02-09 13:09:12,666 Epoch 1: total training loss 1071.55\n", "2020-02-09 13:09:12,666 EPOCH 2\n", "2020-02-09 13:09:20,841 Epoch 2 Step: 300 Batch Loss: 4.062344 Tokens per Sec: 21707, Lr: 0.000300\n", "2020-02-09 13:09:30,628 Epoch 2 Step: 400 Batch Loss: 3.815988 Tokens per Sec: 21269, Lr: 0.000300\n", "2020-02-09 13:09:33,876 Epoch 2: total training loss 892.39\n", "2020-02-09 13:09:33,876 EPOCH 3\n", "2020-02-09 13:09:40,491 Epoch 3 Step: 500 Batch Loss: 3.743718 Tokens per Sec: 21765, Lr: 0.000300\n", "2020-02-09 13:09:50,349 Epoch 3 Step: 600 Batch Loss: 4.020455 Tokens per Sec: 21096, Lr: 0.000300\n", "2020-02-09 13:09:55,169 Epoch 3: total training loss 791.86\n", "2020-02-09 13:09:55,170 EPOCH 4\n", "2020-02-09 13:10:00,217 Epoch 4 Step: 700 Batch Loss: 3.402556 Tokens per Sec: 20972, Lr: 0.000300\n", "2020-02-09 13:10:10,054 Epoch 4 Step: 800 Batch Loss: 3.404273 Tokens per Sec: 21883, Lr: 0.000300\n", "2020-02-09 13:10:16,470 Epoch 4: total training loss 740.73\n", "2020-02-09 13:10:16,471 EPOCH 5\n", "2020-02-09 13:10:19,994 Epoch 5 Step: 900 Batch Loss: 3.135571 Tokens per Sec: 21375, Lr: 0.000300\n", "2020-02-09 13:10:29,865 Epoch 5 Step: 1000 Batch Loss: 3.272286 Tokens per Sec: 21446, Lr: 0.000300\n", "2020-02-09 13:10:43,280 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:10:43,280 Saving new checkpoint.\n", "2020-02-09 13:10:43,449 Example #0\n", "2020-02-09 13:10:43,450 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:10:43,450 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:10:43,450 \tHypothesis: E no be sey : ‘ Jehovah no dey do wetin Jehovah want . ’ ​ — 1 Cor .\n", "2020-02-09 13:10:43,450 Example #1\n", "2020-02-09 13:10:43,450 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:10:43,450 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:10:43,450 \tHypothesis: ( Acts 3 : 1 ) But , we dey talk sey : ‘ When we dey call the brothers and sisters wey dey call the congregation , and sisters wey dey call am .\n", "2020-02-09 13:10:43,450 Example #2\n", "2020-02-09 13:10:43,450 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:10:43,451 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:10:43,451 \tHypothesis: ( b ) Wetin we dey do to do wetin Jehovah want make dem dey do for the kind thing wey dem dey do for the kind thing wey dem dey do .\n", "2020-02-09 13:10:43,451 Example #3\n", "2020-02-09 13:10:43,451 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:10:43,451 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:10:43,451 \tHypothesis: ( Acts 3 : 1 ) But we no dey do wetin we dey do , we go do wetin we go do .\n", "2020-02-09 13:10:43,451 Validation result (greedy) at epoch 5, step 1000: bleu: 2.06, loss: 73394.2344, ppl: 24.6552, duration: 13.5851s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:10:51,461 Epoch 5: total training loss 712.52\n", "2020-02-09 13:10:51,461 EPOCH 6\n", "2020-02-09 13:10:53,275 Epoch 6 Step: 1100 Batch Loss: 3.265706 Tokens per Sec: 21850, Lr: 0.000300\n", "2020-02-09 13:11:03,098 Epoch 6 Step: 1200 Batch Loss: 3.291728 Tokens per Sec: 21540, Lr: 0.000300\n", "2020-02-09 13:11:12,990 Epoch 6 Step: 1300 Batch Loss: 2.943817 Tokens per Sec: 20834, Lr: 0.000300\n", "2020-02-09 13:11:12,991 Epoch 6: total training loss 684.58\n", "2020-02-09 13:11:12,991 EPOCH 7\n", "2020-02-09 13:11:22,883 Epoch 7 Step: 1400 Batch Loss: 2.914759 Tokens per Sec: 21785, Lr: 0.000300\n", "2020-02-09 13:11:32,637 Epoch 7 Step: 1500 Batch Loss: 3.037468 Tokens per Sec: 21424, Lr: 0.000300\n", "2020-02-09 13:11:34,103 Epoch 7: total training loss 649.63\n", "2020-02-09 13:11:34,104 EPOCH 8\n", "2020-02-09 13:11:42,433 Epoch 8 Step: 1600 Batch Loss: 2.900786 Tokens per Sec: 21571, Lr: 0.000300\n", "2020-02-09 13:11:52,268 Epoch 8 Step: 1700 Batch Loss: 3.398431 Tokens per Sec: 21592, Lr: 0.000300\n", "2020-02-09 13:11:55,311 Epoch 8: total training loss 628.21\n", "2020-02-09 13:11:55,312 EPOCH 9\n", "2020-02-09 13:12:02,142 Epoch 9 Step: 1800 Batch Loss: 2.778916 Tokens per Sec: 21428, Lr: 0.000300\n", "2020-02-09 13:12:11,969 Epoch 9 Step: 1900 Batch Loss: 2.773610 Tokens per Sec: 21280, Lr: 0.000300\n", "2020-02-09 13:12:16,674 Epoch 9: total training loss 612.58\n", "2020-02-09 13:12:16,674 EPOCH 10\n", "2020-02-09 13:12:21,831 Epoch 10 Step: 2000 Batch Loss: 2.661200 Tokens per Sec: 21429, Lr: 0.000300\n", "2020-02-09 13:12:34,991 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:12:34,991 Saving new checkpoint.\n", "2020-02-09 13:12:35,158 Example #0\n", "2020-02-09 13:12:35,158 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:12:35,158 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:12:35,158 \tHypothesis: Jehovah no dey do wetin e talk . E sey : ‘ Jehovah no dey do wetin e talk . ’\n", "2020-02-09 13:12:35,158 Example #1\n", "2020-02-09 13:12:35,159 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:12:35,159 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:12:35,159 \tHypothesis: E talk sey : “ E dey hard us to preach to our house and we dey go preach for where we dey call Toooooine .\n", "2020-02-09 13:12:35,159 Example #2\n", "2020-02-09 13:12:35,159 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:12:35,159 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:12:35,159 \tHypothesis: E no be sey , Jehovah dey bless us and dey help dem get faith and dey do wetin dem want .\n", "2020-02-09 13:12:35,159 Example #3\n", "2020-02-09 13:12:35,159 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:12:35,159 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:12:35,160 \tHypothesis: If we dey do wetin Jehovah want , we go do wetin we go do to do .\n", "2020-02-09 13:12:35,160 Validation result (greedy) at epoch 10, step 2000: bleu: 4.45, loss: 62560.9414, ppl: 15.3623, duration: 13.3281s\n", "2020-02-09 13:12:44,997 Epoch 10 Step: 2100 Batch Loss: 1.763262 Tokens per Sec: 21487, Lr: 0.000300\n", "2020-02-09 13:12:51,278 Epoch 10: total training loss 588.89\n", "2020-02-09 13:12:51,279 EPOCH 11\n", "2020-02-09 13:12:54,851 Epoch 11 Step: 2200 Batch Loss: 2.823848 Tokens per Sec: 21730, Lr: 0.000300\n", "2020-02-09 13:13:04,681 Epoch 11 Step: 2300 Batch Loss: 2.997061 Tokens per Sec: 21470, Lr: 0.000300\n", "2020-02-09 13:13:12,584 Epoch 11: total training loss 572.20\n", "2020-02-09 13:13:12,584 EPOCH 12\n", "2020-02-09 13:13:14,589 Epoch 12 Step: 2400 Batch Loss: 2.575049 Tokens per Sec: 20842, Lr: 0.000300\n", "2020-02-09 13:13:24,479 Epoch 12 Step: 2500 Batch Loss: 2.632221 Tokens per Sec: 21450, Lr: 0.000300\n", "2020-02-09 13:13:33,887 Epoch 12: total training loss 553.62\n", "2020-02-09 13:13:33,887 EPOCH 13\n", "2020-02-09 13:13:34,417 Epoch 13 Step: 2600 Batch Loss: 1.527499 Tokens per Sec: 16468, Lr: 0.000300\n", "2020-02-09 13:13:44,302 Epoch 13 Step: 2700 Batch Loss: 1.940548 Tokens per Sec: 21794, Lr: 0.000300\n", "2020-02-09 13:13:54,158 Epoch 13 Step: 2800 Batch Loss: 2.280367 Tokens per Sec: 21558, Lr: 0.000300\n", "2020-02-09 13:13:55,049 Epoch 13: total training loss 535.02\n", "2020-02-09 13:13:55,050 EPOCH 14\n", "2020-02-09 13:14:04,072 Epoch 14 Step: 2900 Batch Loss: 2.631029 Tokens per Sec: 21414, Lr: 0.000300\n", "2020-02-09 13:14:14,098 Epoch 14 Step: 3000 Batch Loss: 2.661278 Tokens per Sec: 21137, Lr: 0.000300\n", "2020-02-09 13:14:29,606 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:14:29,606 Saving new checkpoint.\n", "2020-02-09 13:14:29,774 Example #0\n", "2020-02-09 13:14:29,774 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:14:29,774 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:14:29,774 \tHypothesis: Jehovah no do wetin e do , but e no want make Jehovah happy .\n", "2020-02-09 13:14:29,774 Example #1\n", "2020-02-09 13:14:29,774 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:14:29,774 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:14:29,774 \tHypothesis: E talk sey : “ We come dey go preach for one place wey we dey call one village wey we dey call one village wey we dey call one village ( wey we dey call call one place ) .\n", "2020-02-09 13:14:29,774 Example #2\n", "2020-02-09 13:14:29,775 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:14:29,775 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:14:29,775 \tHypothesis: E no easy for us to love Jehovah and we love dem .\n", "2020-02-09 13:14:29,775 Example #3\n", "2020-02-09 13:14:29,775 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:14:29,775 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:14:29,775 \tHypothesis: As we dey do wetin Jehovah want , we no go happy . E dey sure sey we dey happy .\n", "2020-02-09 13:14:29,775 Validation result (greedy) at epoch 14, step 3000: bleu: 7.74, loss: 56698.6953, ppl: 11.8927, duration: 15.6763s\n", "2020-02-09 13:14:32,186 Epoch 14: total training loss 525.88\n", "2020-02-09 13:14:32,186 EPOCH 15\n", "2020-02-09 13:14:39,857 Epoch 15 Step: 3100 Batch Loss: 2.564197 Tokens per Sec: 21377, Lr: 0.000300\n", "2020-02-09 13:14:49,763 Epoch 15 Step: 3200 Batch Loss: 1.934755 Tokens per Sec: 21579, Lr: 0.000300\n", "2020-02-09 13:14:53,607 Epoch 15: total training loss 513.42\n", "2020-02-09 13:14:53,607 EPOCH 16\n", "2020-02-09 13:14:59,689 Epoch 16 Step: 3300 Batch Loss: 2.391761 Tokens per Sec: 21625, Lr: 0.000300\n", "2020-02-09 13:15:09,624 Epoch 16 Step: 3400 Batch Loss: 2.533697 Tokens per Sec: 21201, Lr: 0.000300\n", "2020-02-09 13:15:15,027 Epoch 16: total training loss 505.16\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:15:15,027 EPOCH 17\n", "2020-02-09 13:15:19,493 Epoch 17 Step: 3500 Batch Loss: 2.162553 Tokens per Sec: 21523, Lr: 0.000300\n", "2020-02-09 13:15:29,371 Epoch 17 Step: 3600 Batch Loss: 2.326906 Tokens per Sec: 21915, Lr: 0.000300\n", "2020-02-09 13:15:36,243 Epoch 17: total training loss 491.85\n", "2020-02-09 13:15:36,244 EPOCH 18\n", "2020-02-09 13:15:39,246 Epoch 18 Step: 3700 Batch Loss: 2.380447 Tokens per Sec: 21543, Lr: 0.000300\n", "2020-02-09 13:15:49,155 Epoch 18 Step: 3800 Batch Loss: 2.020103 Tokens per Sec: 21464, Lr: 0.000300\n", "2020-02-09 13:15:57,601 Epoch 18: total training loss 487.15\n", "2020-02-09 13:15:57,602 EPOCH 19\n", "2020-02-09 13:15:59,043 Epoch 19 Step: 3900 Batch Loss: 2.084379 Tokens per Sec: 19882, Lr: 0.000300\n", "2020-02-09 13:16:08,997 Epoch 19 Step: 4000 Batch Loss: 2.317714 Tokens per Sec: 21471, Lr: 0.000300\n", "2020-02-09 13:16:19,854 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:16:19,855 Saving new checkpoint.\n", "2020-02-09 13:16:20,020 Example #0\n", "2020-02-09 13:16:20,020 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:16:20,020 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:16:20,020 \tHypothesis: Jehovah no want make Job no do wetin e want . But e no do wetin Satan talk .\n", "2020-02-09 13:16:20,020 Example #1\n", "2020-02-09 13:16:20,021 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:16:20,021 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:16:20,021 \tHypothesis: After , dem come talk sey : “ We come dey go house for where we dey go preach for where we dey go .\n", "2020-02-09 13:16:20,021 Example #2\n", "2020-02-09 13:16:20,021 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:16:20,021 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:16:20,021 \tHypothesis: E no easy for people to love theirself and dey show love dem .\n", "2020-02-09 13:16:20,021 Example #3\n", "2020-02-09 13:16:20,022 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:16:20,022 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:16:20,022 \tHypothesis: Because we love Jehovah and we love am , we no go happy sey we need to dey happy .\n", "2020-02-09 13:16:20,022 Validation result (greedy) at epoch 19, step 4000: bleu: 8.94, loss: 53580.6719, ppl: 10.3788, duration: 11.0240s\n", "2020-02-09 13:16:29,848 Epoch 19 Step: 4100 Batch Loss: 2.156765 Tokens per Sec: 21434, Lr: 0.000300\n", "2020-02-09 13:16:30,045 Epoch 19: total training loss 476.53\n", "2020-02-09 13:16:30,045 EPOCH 20\n", "2020-02-09 13:16:39,801 Epoch 20 Step: 4200 Batch Loss: 2.439323 Tokens per Sec: 21410, Lr: 0.000300\n", "2020-02-09 13:16:49,787 Epoch 20 Step: 4300 Batch Loss: 2.393922 Tokens per Sec: 21434, Lr: 0.000300\n", "2020-02-09 13:16:51,468 Epoch 20: total training loss 466.71\n", "2020-02-09 13:16:51,468 EPOCH 21\n", "2020-02-09 13:16:59,803 Epoch 21 Step: 4400 Batch Loss: 2.056876 Tokens per Sec: 21059, Lr: 0.000300\n", "2020-02-09 13:17:09,793 Epoch 21 Step: 4500 Batch Loss: 2.316501 Tokens per Sec: 21219, Lr: 0.000300\n", "2020-02-09 13:17:13,105 Epoch 21: total training loss 457.23\n", "2020-02-09 13:17:13,106 EPOCH 22\n", "2020-02-09 13:17:19,952 Epoch 22 Step: 4600 Batch Loss: 2.074961 Tokens per Sec: 21435, Lr: 0.000300\n", "2020-02-09 13:17:29,924 Epoch 22 Step: 4700 Batch Loss: 1.109493 Tokens per Sec: 20771, Lr: 0.000300\n", "2020-02-09 13:17:34,798 Epoch 22: total training loss 454.24\n", "2020-02-09 13:17:34,799 EPOCH 23\n", "2020-02-09 13:17:39,934 Epoch 23 Step: 4800 Batch Loss: 2.000134 Tokens per Sec: 20982, Lr: 0.000300\n", "2020-02-09 13:17:49,934 Epoch 23 Step: 4900 Batch Loss: 2.151453 Tokens per Sec: 21163, Lr: 0.000300\n", "2020-02-09 13:17:56,626 Epoch 23: total training loss 450.66\n", "2020-02-09 13:17:56,626 EPOCH 24\n", "2020-02-09 13:17:59,974 Epoch 24 Step: 5000 Batch Loss: 2.229525 Tokens per Sec: 21697, Lr: 0.000300\n", "2020-02-09 13:18:13,124 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:18:13,124 Saving new checkpoint.\n", "2020-02-09 13:18:13,291 Example #0\n", "2020-02-09 13:18:13,291 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:18:13,291 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:18:13,291 \tHypothesis: Jehovah no let Job do wetin e do . E sey : ‘ Na your hand e don do for you . ’\n", "2020-02-09 13:18:13,291 Example #1\n", "2020-02-09 13:18:13,291 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:18:13,291 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:18:13,291 \tHypothesis: Dem talk sey : “ We come comot for where we go stay for where we dey stay for where we dey stay . We come comot for where we dey stay .\n", "2020-02-09 13:18:13,292 Example #2\n", "2020-02-09 13:18:13,292 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:18:13,292 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:18:13,292 \tHypothesis: E no easy for dem to dey show love theirself and dey show love dem .\n", "2020-02-09 13:18:13,292 Example #3\n", "2020-02-09 13:18:13,292 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:18:13,292 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:18:13,292 \tHypothesis: As we love Jehovah and e love us , we know sey e go really happy .\n", "2020-02-09 13:18:13,292 Validation result (greedy) at epoch 24, step 5000: bleu: 10.12, loss: 51855.6953, ppl: 9.6257, duration: 13.3177s\n", "2020-02-09 13:18:23,119 Epoch 24 Step: 5100 Batch Loss: 2.074046 Tokens per Sec: 21824, Lr: 0.000300\n", "2020-02-09 13:18:31,275 Epoch 24: total training loss 439.66\n", "2020-02-09 13:18:31,275 EPOCH 25\n", "2020-02-09 13:18:32,990 Epoch 25 Step: 5200 Batch Loss: 2.131806 Tokens per Sec: 21127, Lr: 0.000300\n", "2020-02-09 13:18:42,814 Epoch 25 Step: 5300 Batch Loss: 2.180349 Tokens per Sec: 21469, Lr: 0.000300\n", "2020-02-09 13:18:52,591 Epoch 25 Step: 5400 Batch Loss: 2.024933 Tokens per Sec: 21256, Lr: 0.000300\n", "2020-02-09 13:18:52,689 Epoch 25: total training loss 438.46\n", "2020-02-09 13:18:52,689 EPOCH 26\n", "2020-02-09 13:19:02,421 Epoch 26 Step: 5500 Batch Loss: 1.761211 Tokens per Sec: 21334, Lr: 0.000300\n", "2020-02-09 13:19:12,238 Epoch 26 Step: 5600 Batch Loss: 2.269718 Tokens per Sec: 21752, Lr: 0.000300\n", "2020-02-09 13:19:13,900 Epoch 26: total training loss 425.87\n", "2020-02-09 13:19:13,900 EPOCH 27\n", "2020-02-09 13:19:22,071 Epoch 27 Step: 5700 Batch Loss: 1.753395 Tokens per Sec: 21273, Lr: 0.000300\n", "2020-02-09 13:19:31,889 Epoch 27 Step: 5800 Batch Loss: 2.062914 Tokens per Sec: 21619, Lr: 0.000300\n", "2020-02-09 13:19:35,229 Epoch 27: total training loss 422.60\n", "2020-02-09 13:19:35,230 EPOCH 28\n", "2020-02-09 13:19:41,731 Epoch 28 Step: 5900 Batch Loss: 1.870739 Tokens per Sec: 20986, Lr: 0.000300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:19:51,539 Epoch 28 Step: 6000 Batch Loss: 2.292132 Tokens per Sec: 22003, Lr: 0.000300\n", "2020-02-09 13:20:03,538 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:20:03,539 Saving new checkpoint.\n", "2020-02-09 13:20:03,726 Example #0\n", "2020-02-09 13:20:03,726 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:20:03,726 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:20:03,726 \tHypothesis: Jehovah no let Job do wetin e do . But e no do wetin e do . E sey : ‘ Na you be the God wey dey do you . ’\n", "2020-02-09 13:20:03,726 Example #1\n", "2020-02-09 13:20:03,726 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:20:03,727 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:20:03,727 \tHypothesis: Dem come talk sey : “ We come comot for our house and we come comot for where we dey stay for evening .\n", "2020-02-09 13:20:03,727 Example #2\n", "2020-02-09 13:20:03,727 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:20:03,727 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:20:03,727 \tHypothesis: Ronurse ( 1 ) dey show love and love dey make people happy when dem dey do things together .\n", "2020-02-09 13:20:03,727 Example #3\n", "2020-02-09 13:20:03,727 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:20:03,727 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:20:03,727 \tHypothesis: ( John 3 : ​ 1 - 3 ) As we dey show sey Jehovah love us , we dey happy and we dey happy .\n", "2020-02-09 13:20:03,727 Validation result (greedy) at epoch 28, step 6000: bleu: 11.03, loss: 50861.6523, ppl: 9.2168, duration: 12.1881s\n", "2020-02-09 13:20:08,735 Epoch 28: total training loss 417.83\n", "2020-02-09 13:20:08,736 EPOCH 29\n", "2020-02-09 13:20:13,728 Epoch 29 Step: 6100 Batch Loss: 2.070772 Tokens per Sec: 20693, Lr: 0.000300\n", "2020-02-09 13:20:23,554 Epoch 29 Step: 6200 Batch Loss: 1.767040 Tokens per Sec: 21648, Lr: 0.000300\n", "2020-02-09 13:20:30,203 Epoch 29: total training loss 411.22\n", "2020-02-09 13:20:30,203 EPOCH 30\n", "2020-02-09 13:20:33,383 Epoch 30 Step: 6300 Batch Loss: 2.193435 Tokens per Sec: 21178, Lr: 0.000300\n", "2020-02-09 13:20:43,198 Epoch 30 Step: 6400 Batch Loss: 2.159051 Tokens per Sec: 21522, Lr: 0.000300\n", "2020-02-09 13:20:51,537 Epoch 30: total training loss 403.51\n", "2020-02-09 13:20:51,537 EPOCH 31\n", "2020-02-09 13:20:53,142 Epoch 31 Step: 6500 Batch Loss: 0.741025 Tokens per Sec: 20335, Lr: 0.000300\n", "2020-02-09 13:21:02,992 Epoch 31 Step: 6600 Batch Loss: 1.625787 Tokens per Sec: 21500, Lr: 0.000300\n", "2020-02-09 13:21:12,830 Epoch 31 Step: 6700 Batch Loss: 1.612840 Tokens per Sec: 21371, Lr: 0.000300\n", "2020-02-09 13:21:12,931 Epoch 31: total training loss 401.02\n", "2020-02-09 13:21:12,932 EPOCH 32\n", "2020-02-09 13:21:22,663 Epoch 32 Step: 6800 Batch Loss: 1.733360 Tokens per Sec: 21536, Lr: 0.000300\n", "2020-02-09 13:21:32,473 Epoch 32 Step: 6900 Batch Loss: 1.805076 Tokens per Sec: 21552, Lr: 0.000300\n", "2020-02-09 13:21:34,139 Epoch 32: total training loss 394.55\n", "2020-02-09 13:21:34,139 EPOCH 33\n", "2020-02-09 13:21:42,326 Epoch 33 Step: 7000 Batch Loss: 1.426562 Tokens per Sec: 21610, Lr: 0.000300\n", "2020-02-09 13:21:53,898 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:21:53,898 Saving new checkpoint.\n", "2020-02-09 13:21:54,085 Example #0\n", "2020-02-09 13:21:54,085 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:21:54,085 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:21:54,085 \tHypothesis: Jehovah no let Job do wetin e no like . E sey : ‘ Satan do you for your hand . ’\n", "2020-02-09 13:21:54,086 Example #1\n", "2020-02-09 13:21:54,086 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:21:54,086 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:21:54,086 \tHypothesis: Corinna : “ We come comot for our place for where we dey stay for evening . Dem come comot for where we dey stay .\n", "2020-02-09 13:21:54,086 Example #2\n", "2020-02-09 13:21:54,086 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:21:54,086 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:21:54,086 \tHypothesis: Rolllllors ( wey dey give pikin love theirself ) , dey show love im family and children .\n", "2020-02-09 13:21:54,086 Example #3\n", "2020-02-09 13:21:54,086 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:21:54,087 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:21:54,087 \tHypothesis: We know sey Jehovah love us and e dey happy well well .\n", "2020-02-09 13:21:54,087 Validation result (greedy) at epoch 33, step 7000: bleu: 11.48, loss: 50434.1406, ppl: 9.0464, duration: 11.7605s\n", "2020-02-09 13:22:03,973 Epoch 33 Step: 7100 Batch Loss: 1.780194 Tokens per Sec: 21274, Lr: 0.000300\n", "2020-02-09 13:22:07,320 Epoch 33: total training loss 390.49\n", "2020-02-09 13:22:07,320 EPOCH 34\n", "2020-02-09 13:22:13,857 Epoch 34 Step: 7200 Batch Loss: 0.929090 Tokens per Sec: 20885, Lr: 0.000300\n", "2020-02-09 13:22:23,781 Epoch 34 Step: 7300 Batch Loss: 2.028444 Tokens per Sec: 21139, Lr: 0.000300\n", "2020-02-09 13:22:28,797 Epoch 34: total training loss 386.96\n", "2020-02-09 13:22:28,797 EPOCH 35\n", "2020-02-09 13:22:33,632 Epoch 35 Step: 7400 Batch Loss: 1.744092 Tokens per Sec: 20742, Lr: 0.000300\n", "2020-02-09 13:22:43,493 Epoch 35 Step: 7500 Batch Loss: 1.600927 Tokens per Sec: 21578, Lr: 0.000300\n", "2020-02-09 13:22:50,093 Epoch 35: total training loss 380.53\n", "2020-02-09 13:22:50,093 EPOCH 36\n", "2020-02-09 13:22:53,379 Epoch 36 Step: 7600 Batch Loss: 1.619538 Tokens per Sec: 21408, Lr: 0.000300\n", "2020-02-09 13:23:03,279 Epoch 36 Step: 7700 Batch Loss: 1.546237 Tokens per Sec: 21128, Lr: 0.000300\n", "2020-02-09 13:23:11,579 Epoch 36: total training loss 377.24\n", "2020-02-09 13:23:11,580 EPOCH 37\n", "2020-02-09 13:23:13,286 Epoch 37 Step: 7800 Batch Loss: 1.759560 Tokens per Sec: 20812, Lr: 0.000300\n", "2020-02-09 13:23:23,110 Epoch 37 Step: 7900 Batch Loss: 1.936061 Tokens per Sec: 21229, Lr: 0.000300\n", "2020-02-09 13:23:32,932 Epoch 37 Step: 8000 Batch Loss: 1.861286 Tokens per Sec: 21484, Lr: 0.000300\n", "2020-02-09 13:23:44,046 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:23:44,046 Saving new checkpoint.\n", "2020-02-09 13:23:44,231 Example #0\n", "2020-02-09 13:23:44,231 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:23:44,231 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:23:44,231 \tHypothesis: Jehovah no let Job do wetin e talk . E sey : ‘ Na you be the God wey dey do for you . ’\n", "2020-02-09 13:23:44,231 Example #1\n", "2020-02-09 13:23:44,231 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:23:44,231 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:23:44,232 \tHypothesis: Corinna comot for prison , e come sey : ‘ We come comot for where we dey stay for where we dey stay for where we dey stay .\n", "2020-02-09 13:23:44,232 Example #2\n", "2020-02-09 13:23:44,232 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:23:44,232 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:23:44,232 \tHypothesis: Ronurse ( wey love dey show love , and wey dey show love their pikin , dey show sey dem love dem .\n", "2020-02-09 13:23:44,232 Example #3\n", "2020-02-09 13:23:44,232 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:23:44,232 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:23:44,232 \tHypothesis: As we love Jehovah and e know wetin e want , we need to really happy .\n", "2020-02-09 13:23:44,232 Validation result (greedy) at epoch 37, step 8000: bleu: 12.06, loss: 50059.2305, ppl: 8.8995, duration: 11.3002s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:23:44,333 Epoch 37: total training loss 372.70\n", "2020-02-09 13:23:44,333 EPOCH 38\n", "2020-02-09 13:23:54,141 Epoch 38 Step: 8100 Batch Loss: 1.849720 Tokens per Sec: 21401, Lr: 0.000300\n", "2020-02-09 13:24:04,016 Epoch 38 Step: 8200 Batch Loss: 1.739787 Tokens per Sec: 21653, Lr: 0.000300\n", "2020-02-09 13:24:05,685 Epoch 38: total training loss 367.20\n", "2020-02-09 13:24:05,685 EPOCH 39\n", "2020-02-09 13:24:13,938 Epoch 39 Step: 8300 Batch Loss: 1.978657 Tokens per Sec: 21301, Lr: 0.000300\n", "2020-02-09 13:24:23,831 Epoch 39 Step: 8400 Batch Loss: 1.544337 Tokens per Sec: 21534, Lr: 0.000300\n", "2020-02-09 13:24:27,069 Epoch 39: total training loss 363.03\n", "2020-02-09 13:24:27,069 EPOCH 40\n", "2020-02-09 13:24:33,676 Epoch 40 Step: 8500 Batch Loss: 1.669919 Tokens per Sec: 21793, Lr: 0.000300\n", "2020-02-09 13:24:43,529 Epoch 40 Step: 8600 Batch Loss: 0.820153 Tokens per Sec: 21331, Lr: 0.000300\n", "2020-02-09 13:24:48,366 Epoch 40: total training loss 360.14\n", "2020-02-09 13:24:48,366 EPOCH 41\n", "2020-02-09 13:24:53,436 Epoch 41 Step: 8700 Batch Loss: 1.652280 Tokens per Sec: 21471, Lr: 0.000300\n", "2020-02-09 13:25:03,307 Epoch 41 Step: 8800 Batch Loss: 1.807984 Tokens per Sec: 21749, Lr: 0.000300\n", "2020-02-09 13:25:09,601 Epoch 41: total training loss 353.74\n", "2020-02-09 13:25:09,602 EPOCH 42\n", "2020-02-09 13:25:13,181 Epoch 42 Step: 8900 Batch Loss: 1.464720 Tokens per Sec: 20739, Lr: 0.000300\n", "2020-02-09 13:25:22,990 Epoch 42 Step: 9000 Batch Loss: 1.595807 Tokens per Sec: 21590, Lr: 0.000300\n", "2020-02-09 13:25:37,129 Example #0\n", "2020-02-09 13:25:37,130 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:25:37,130 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:25:37,130 \tHypothesis: Jehovah no let Job do wetin e no like . E sey : ‘ Na you be the God wey high pass for you . ’\n", "2020-02-09 13:25:37,130 Example #1\n", "2020-02-09 13:25:37,130 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:25:37,130 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:25:37,130 \tHypothesis: Corinna : “ We comot for where we dey stay for evening and we dey travel go the place wey we dey call railation ( 15 km ) .\n", "2020-02-09 13:25:37,130 Example #2\n", "2020-02-09 13:25:37,131 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:25:37,131 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:25:37,131 \tHypothesis: Rolors ( 1 ) We really love each other , and we dey happy as we dey preach .\n", "2020-02-09 13:25:37,131 Example #3\n", "2020-02-09 13:25:37,131 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:25:37,131 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:25:37,131 \tHypothesis: ( John 6 : 14 ) As Jehovah love us , we know wetin e really need and e dey happy well well .\n", "2020-02-09 13:25:37,131 Validation result (greedy) at epoch 42, step 9000: bleu: 12.70, loss: 50120.6484, ppl: 8.9234, duration: 14.1406s\n", "2020-02-09 13:25:45,018 Epoch 42: total training loss 350.83\n", "2020-02-09 13:25:45,019 EPOCH 43\n", "2020-02-09 13:25:47,028 Epoch 43 Step: 9100 Batch Loss: 1.502720 Tokens per Sec: 21261, Lr: 0.000300\n", "2020-02-09 13:25:56,925 Epoch 43 Step: 9200 Batch Loss: 1.593139 Tokens per Sec: 21399, Lr: 0.000300\n", "2020-02-09 13:26:06,319 Epoch 43: total training loss 346.06\n", "2020-02-09 13:26:06,319 EPOCH 44\n", "2020-02-09 13:26:06,855 Epoch 44 Step: 9300 Batch Loss: 1.669592 Tokens per Sec: 20145, Lr: 0.000300\n", "2020-02-09 13:26:16,765 Epoch 44 Step: 9400 Batch Loss: 1.565888 Tokens per Sec: 21463, Lr: 0.000300\n", "2020-02-09 13:26:26,549 Epoch 44 Step: 9500 Batch Loss: 1.861203 Tokens per Sec: 21414, Lr: 0.000300\n", "2020-02-09 13:26:27,717 Epoch 44: total training loss 345.56\n", "2020-02-09 13:26:27,717 EPOCH 45\n", "2020-02-09 13:26:36,371 Epoch 45 Step: 9600 Batch Loss: 1.794674 Tokens per Sec: 21314, Lr: 0.000300\n", "2020-02-09 13:26:46,197 Epoch 45 Step: 9700 Batch Loss: 1.868837 Tokens per Sec: 21749, Lr: 0.000300\n", "2020-02-09 13:26:49,030 Epoch 45: total training loss 342.00\n", "2020-02-09 13:26:49,030 EPOCH 46\n", "2020-02-09 13:26:56,034 Epoch 46 Step: 9800 Batch Loss: 1.527628 Tokens per Sec: 21328, Lr: 0.000300\n", "2020-02-09 13:27:05,864 Epoch 46 Step: 9900 Batch Loss: 1.642803 Tokens per Sec: 21055, Lr: 0.000300\n", "2020-02-09 13:27:10,475 Epoch 46: total training loss 342.86\n", "2020-02-09 13:27:10,475 EPOCH 47\n", "2020-02-09 13:27:15,741 Epoch 47 Step: 10000 Batch Loss: 1.579610 Tokens per Sec: 20682, Lr: 0.000300\n", "2020-02-09 13:27:26,907 Hooray! New best validation result [ppl]!\n", "2020-02-09 13:27:26,907 Saving new checkpoint.\n", "2020-02-09 13:27:27,091 Example #0\n", "2020-02-09 13:27:27,092 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:27:27,092 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:27:27,092 \tHypothesis: Jehovah no let Satan do wetin e talk . E sey : ‘ Na you be the number one thing for you . ’\n", "2020-02-09 13:27:27,092 Example #1\n", "2020-02-09 13:27:27,092 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:27:27,092 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:27:27,092 \tHypothesis: Corinna comot for prison , e sey : “ We comot for where we dey stay for evening and we stay there for 25 miles ( 15 km ) .\n", "2020-02-09 13:27:27,092 Example #2\n", "2020-02-09 13:27:27,092 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:27:27,092 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:27:27,093 \tHypothesis: Rolors ( wey dey show love , and wey dey show love , dey show love for family and family .\n", "2020-02-09 13:27:27,093 Example #3\n", "2020-02-09 13:27:27,093 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:27:27,093 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:27:27,093 \tHypothesis: As we love Jehovah and Jehovah , we know wetin e really need and wetin e want make we happy .\n", "2020-02-09 13:27:27,093 Validation result (greedy) at epoch 47, step 10000: bleu: 12.58, loss: 49895.8438, ppl: 8.8362, duration: 11.3514s\n", "2020-02-09 13:27:36,949 Epoch 47 Step: 10100 Batch Loss: 1.688607 Tokens per Sec: 21984, Lr: 0.000300\n", "2020-02-09 13:27:43,026 Epoch 47: total training loss 332.75\n", "2020-02-09 13:27:43,026 EPOCH 48\n", "2020-02-09 13:27:46,786 Epoch 48 Step: 10200 Batch Loss: 1.651279 Tokens per Sec: 21345, Lr: 0.000300\n", "2020-02-09 13:27:56,620 Epoch 48 Step: 10300 Batch Loss: 1.034909 Tokens per Sec: 21429, Lr: 0.000300\n", "2020-02-09 13:28:04,303 Epoch 48: total training loss 330.69\n", "2020-02-09 13:28:04,304 EPOCH 49\n", "2020-02-09 13:28:06,502 Epoch 49 Step: 10400 Batch Loss: 1.507079 Tokens per Sec: 20645, Lr: 0.000300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:28:16,315 Epoch 49 Step: 10500 Batch Loss: 1.630366 Tokens per Sec: 21649, Lr: 0.000300\n", "2020-02-09 13:28:25,449 Epoch 49: total training loss 325.25\n", "2020-02-09 13:28:25,450 EPOCH 50\n", "2020-02-09 13:28:26,177 Epoch 50 Step: 10600 Batch Loss: 1.349531 Tokens per Sec: 20812, Lr: 0.000300\n", "2020-02-09 13:28:35,985 Epoch 50 Step: 10700 Batch Loss: 1.566842 Tokens per Sec: 21428, Lr: 0.000300\n", "2020-02-09 13:28:45,809 Epoch 50 Step: 10800 Batch Loss: 1.492050 Tokens per Sec: 21841, Lr: 0.000300\n", "2020-02-09 13:28:46,590 Epoch 50: total training loss 321.43\n", "2020-02-09 13:28:46,590 EPOCH 51\n", "2020-02-09 13:28:55,623 Epoch 51 Step: 10900 Batch Loss: 1.727898 Tokens per Sec: 21534, Lr: 0.000300\n", "2020-02-09 13:29:05,462 Epoch 51 Step: 11000 Batch Loss: 1.638900 Tokens per Sec: 21320, Lr: 0.000300\n", "2020-02-09 13:29:16,137 Example #0\n", "2020-02-09 13:29:16,137 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:29:16,137 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:29:16,137 \tHypothesis: Jehovah no let Satan do wetin e no like . But e let Satan do wetin e no like .\n", "2020-02-09 13:29:16,137 Example #1\n", "2020-02-09 13:29:16,137 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:29:16,137 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:29:16,137 \tHypothesis: Corinna comot for prison . Dem come dey prison for 25 miles ( 15 km ) .\n", "2020-02-09 13:29:16,138 Example #2\n", "2020-02-09 13:29:16,138 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:29:16,138 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:29:16,138 \tHypothesis: We dey happy well well when we love our family ( 1 ) dey show love , and ( 3 ) dey show dem love .\n", "2020-02-09 13:29:16,138 Example #3\n", "2020-02-09 13:29:16,138 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:29:16,138 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:29:16,138 \tHypothesis: We know sey Jehovah love us and e know wetin we need .\n", "2020-02-09 13:29:16,138 Validation result (greedy) at epoch 51, step 11000: bleu: 12.10, loss: 50398.8008, ppl: 9.0324, duration: 10.6758s\n", "2020-02-09 13:29:18,600 Epoch 51: total training loss 322.21\n", "2020-02-09 13:29:18,600 EPOCH 52\n", "2020-02-09 13:29:26,067 Epoch 52 Step: 11100 Batch Loss: 0.603823 Tokens per Sec: 21118, Lr: 0.000300\n", "2020-02-09 13:29:35,950 Epoch 52 Step: 11200 Batch Loss: 1.278073 Tokens per Sec: 21599, Lr: 0.000300\n", "2020-02-09 13:29:39,899 Epoch 52: total training loss 315.59\n", "2020-02-09 13:29:39,899 EPOCH 53\n", "2020-02-09 13:29:45,864 Epoch 53 Step: 11300 Batch Loss: 1.335888 Tokens per Sec: 21391, Lr: 0.000300\n", "2020-02-09 13:29:55,820 Epoch 53 Step: 11400 Batch Loss: 1.285490 Tokens per Sec: 21294, Lr: 0.000300\n", "2020-02-09 13:30:01,443 Epoch 53: total training loss 316.14\n", "2020-02-09 13:30:01,443 EPOCH 54\n", "2020-02-09 13:30:05,715 Epoch 54 Step: 11500 Batch Loss: 1.644282 Tokens per Sec: 21015, Lr: 0.000300\n", "2020-02-09 13:30:15,582 Epoch 54 Step: 11600 Batch Loss: 1.424573 Tokens per Sec: 21706, Lr: 0.000300\n", "2020-02-09 13:30:22,787 Epoch 54: total training loss 311.70\n", "2020-02-09 13:30:22,788 EPOCH 55\n", "2020-02-09 13:30:25,473 Epoch 55 Step: 11700 Batch Loss: 1.565390 Tokens per Sec: 21123, Lr: 0.000300\n", "2020-02-09 13:30:35,413 Epoch 55 Step: 11800 Batch Loss: 1.487010 Tokens per Sec: 21247, Lr: 0.000300\n", "2020-02-09 13:30:44,186 Epoch 55: total training loss 309.11\n", "2020-02-09 13:30:44,187 EPOCH 56\n", "2020-02-09 13:30:45,313 Epoch 56 Step: 11900 Batch Loss: 1.130963 Tokens per Sec: 20274, Lr: 0.000300\n", "2020-02-09 13:30:55,211 Epoch 56 Step: 12000 Batch Loss: 1.301956 Tokens per Sec: 21966, Lr: 0.000300\n", "2020-02-09 13:31:06,685 Example #0\n", "2020-02-09 13:31:06,685 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:31:06,685 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:31:06,686 \tHypothesis: Jehovah no let Satan do wetin e no like . E sey : ‘ Everything wey Satan don do for you , na im be the hand of God wey dey give you hand . ’\n", "2020-02-09 13:31:06,686 Example #1\n", "2020-02-09 13:31:06,686 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:31:06,686 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:31:06,686 \tHypothesis: Corinna come talk sey : “ We comot for where we dey stay for where we dey stay for where we dey stay . We dey stay there for there .\n", "2020-02-09 13:31:06,686 Example #2\n", "2020-02-09 13:31:06,686 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:31:06,686 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:31:06,686 \tHypothesis: Roldis ( love imself and their family . Dem dey like to love their family people .\n", "2020-02-09 13:31:06,686 Example #3\n", "2020-02-09 13:31:06,687 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:31:06,687 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:31:06,687 \tHypothesis: Jehovah love us and e know wetin e want make we do .\n", "2020-02-09 13:31:06,687 Validation result (greedy) at epoch 56, step 12000: bleu: 12.89, loss: 50478.4766, ppl: 9.0639, duration: 11.4748s\n", "2020-02-09 13:31:16,584 Epoch 56 Step: 12100 Batch Loss: 1.529229 Tokens per Sec: 21081, Lr: 0.000300\n", "2020-02-09 13:31:17,070 Epoch 56: total training loss 305.48\n", "2020-02-09 13:31:17,070 EPOCH 57\n", "2020-02-09 13:31:26,548 Epoch 57 Step: 12200 Batch Loss: 1.529426 Tokens per Sec: 21076, Lr: 0.000300\n", "2020-02-09 13:31:36,413 Epoch 57 Step: 12300 Batch Loss: 1.737350 Tokens per Sec: 21418, Lr: 0.000300\n", "2020-02-09 13:31:38,570 Epoch 57: total training loss 304.36\n", "2020-02-09 13:31:38,570 EPOCH 58\n", "2020-02-09 13:31:46,348 Epoch 58 Step: 12400 Batch Loss: 1.672396 Tokens per Sec: 21211, Lr: 0.000300\n", "2020-02-09 13:31:56,283 Epoch 58 Step: 12500 Batch Loss: 1.629755 Tokens per Sec: 21376, Lr: 0.000300\n", "2020-02-09 13:32:00,024 Epoch 58: total training loss 300.52\n", "2020-02-09 13:32:00,025 EPOCH 59\n", "2020-02-09 13:32:06,235 Epoch 59 Step: 12600 Batch Loss: 1.617129 Tokens per Sec: 20979, Lr: 0.000300\n", "2020-02-09 13:32:16,261 Epoch 59 Step: 12700 Batch Loss: 1.444259 Tokens per Sec: 21139, Lr: 0.000300\n", "2020-02-09 13:32:21,689 Epoch 59: total training loss 298.62\n", "2020-02-09 13:32:21,689 EPOCH 60\n", "2020-02-09 13:32:26,204 Epoch 60 Step: 12800 Batch Loss: 1.523116 Tokens per Sec: 20963, Lr: 0.000300\n", "2020-02-09 13:32:36,146 Epoch 60 Step: 12900 Batch Loss: 1.396388 Tokens per Sec: 21173, Lr: 0.000300\n", "2020-02-09 13:32:43,263 Epoch 60: total training loss 295.90\n", "2020-02-09 13:32:43,263 EPOCH 61\n", "2020-02-09 13:32:46,087 Epoch 61 Step: 13000 Batch Loss: 0.562521 Tokens per Sec: 19269, Lr: 0.000300\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:33:01,390 Example #0\n", "2020-02-09 13:33:01,390 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:33:01,390 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:33:01,390 \tHypothesis: Jehovah no let Satan do wetin e no like . E sey : ‘ Everything wey Satan dey do , na im be the true God wey dey give you hand . ’\n", "2020-02-09 13:33:01,390 Example #1\n", "2020-02-09 13:33:01,391 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:33:01,391 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:33:01,391 \tHypothesis: Corinna come talk sey : “ We comot for where we dey stay for the prison , and we dey stay prison for 25 miles ( 15 km ) .\n", "2020-02-09 13:33:01,391 Example #2\n", "2020-02-09 13:33:01,391 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:33:01,391 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:33:01,391 \tHypothesis: Romanager wey dey show love , dey enjoy love , and their family people dey enjoy theirself .\n", "2020-02-09 13:33:01,391 Example #3\n", "2020-02-09 13:33:01,391 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:33:01,391 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:33:01,392 \tHypothesis: ( 1 John 4 : 8 ) As we love Jehovah and the things wey e make , we know sey e really need to really dey happy .\n", "2020-02-09 13:33:01,392 Validation result (greedy) at epoch 61, step 13000: bleu: 12.72, loss: 51401.3750, ppl: 9.4367, duration: 15.3039s\n", "2020-02-09 13:33:11,395 Epoch 61 Step: 13100 Batch Loss: 1.358288 Tokens per Sec: 21458, Lr: 0.000300\n", "2020-02-09 13:33:20,230 Epoch 61: total training loss 294.58\n", "2020-02-09 13:33:20,230 EPOCH 62\n", "2020-02-09 13:33:21,368 Epoch 62 Step: 13200 Batch Loss: 1.458893 Tokens per Sec: 20874, Lr: 0.000300\n", "2020-02-09 13:33:31,321 Epoch 62 Step: 13300 Batch Loss: 1.583393 Tokens per Sec: 21148, Lr: 0.000300\n", "2020-02-09 13:33:41,274 Epoch 62 Step: 13400 Batch Loss: 1.570911 Tokens per Sec: 21341, Lr: 0.000300\n", "2020-02-09 13:33:41,863 Epoch 62: total training loss 290.13\n", "2020-02-09 13:33:41,863 EPOCH 63\n", "2020-02-09 13:33:51,272 Epoch 63 Step: 13500 Batch Loss: 1.357709 Tokens per Sec: 21343, Lr: 0.000300\n", "2020-02-09 13:34:01,250 Epoch 63 Step: 13600 Batch Loss: 1.376045 Tokens per Sec: 20871, Lr: 0.000300\n", "2020-02-09 13:34:03,638 Epoch 63: total training loss 289.45\n", "2020-02-09 13:34:03,638 EPOCH 64\n", "2020-02-09 13:34:11,424 Epoch 64 Step: 13700 Batch Loss: 1.588900 Tokens per Sec: 20835, Lr: 0.000300\n", "2020-02-09 13:34:21,609 Epoch 64 Step: 13800 Batch Loss: 1.281900 Tokens per Sec: 20600, Lr: 0.000300\n", "2020-02-09 13:34:25,639 Epoch 64: total training loss 284.62\n", "2020-02-09 13:34:25,639 EPOCH 65\n", "2020-02-09 13:34:31,810 Epoch 65 Step: 13900 Batch Loss: 1.419438 Tokens per Sec: 20970, Lr: 0.000300\n", "2020-02-09 13:34:41,993 Epoch 65 Step: 14000 Batch Loss: 1.384162 Tokens per Sec: 20759, Lr: 0.000300\n", "2020-02-09 13:34:54,215 Example #0\n", "2020-02-09 13:34:54,215 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:34:54,215 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:34:54,215 \tHypothesis: Jehovah no let Satan suffer Job . But e let Satan suffer Job . E sey : ‘ Anybody wey get power dey your hand dey your hand . ’\n", "2020-02-09 13:34:54,215 Example #1\n", "2020-02-09 13:34:54,215 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:34:54,216 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:34:54,216 \tHypothesis: Corinna , wey come from France talk sey : “ We comot for where we dey stay for where we dey stay . We still comot go where dem dey call raton top to go .\n", "2020-02-09 13:34:54,216 Example #2\n", "2020-02-09 13:34:54,216 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:34:54,216 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:34:54,216 \tHypothesis: Romane , and the kind love wey we get for our family and our family dey make dem happy .\n", "2020-02-09 13:34:54,216 Example #3\n", "2020-02-09 13:34:54,216 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:34:54,216 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:34:54,216 \tHypothesis: We know sey Jehovah love us and e know wetin we suppose do .\n", "2020-02-09 13:34:54,216 Validation result (greedy) at epoch 65, step 14000: bleu: 13.24, loss: 51182.5859, ppl: 9.3469, duration: 12.2226s\n", "2020-02-09 13:34:59,833 Epoch 65: total training loss 280.51\n", "2020-02-09 13:34:59,833 EPOCH 66\n", "2020-02-09 13:35:04,419 Epoch 66 Step: 14100 Batch Loss: 0.551482 Tokens per Sec: 20886, Lr: 0.000300\n", "2020-02-09 13:35:14,702 Epoch 66 Step: 14200 Batch Loss: 1.378283 Tokens per Sec: 20206, Lr: 0.000300\n", "2020-02-09 13:35:22,014 Epoch 66: total training loss 280.47\n", "2020-02-09 13:35:22,014 EPOCH 67\n", "2020-02-09 13:35:24,923 Epoch 67 Step: 14300 Batch Loss: 1.419129 Tokens per Sec: 19968, Lr: 0.000300\n", "2020-02-09 13:35:35,095 Epoch 67 Step: 14400 Batch Loss: 1.305179 Tokens per Sec: 21055, Lr: 0.000300\n", "2020-02-09 13:35:43,916 Epoch 67: total training loss 276.47\n", "2020-02-09 13:35:43,917 EPOCH 68\n", "2020-02-09 13:35:45,169 Epoch 68 Step: 14500 Batch Loss: 1.365470 Tokens per Sec: 21257, Lr: 0.000300\n", "2020-02-09 13:35:55,142 Epoch 68 Step: 14600 Batch Loss: 1.538778 Tokens per Sec: 21169, Lr: 0.000300\n", "2020-02-09 13:36:05,076 Epoch 68 Step: 14700 Batch Loss: 1.253477 Tokens per Sec: 21499, Lr: 0.000300\n", "2020-02-09 13:36:05,374 Epoch 68: total training loss 273.73\n", "2020-02-09 13:36:05,375 EPOCH 69\n", "2020-02-09 13:36:15,068 Epoch 69 Step: 14800 Batch Loss: 1.171915 Tokens per Sec: 21276, Lr: 0.000300\n", "2020-02-09 13:36:24,979 Epoch 69 Step: 14900 Batch Loss: 1.178302 Tokens per Sec: 21236, Lr: 0.000300\n", "2020-02-09 13:36:26,854 Epoch 69: total training loss 273.45\n", "2020-02-09 13:36:26,855 EPOCH 70\n", "2020-02-09 13:36:34,912 Epoch 70 Step: 15000 Batch Loss: 1.243322 Tokens per Sec: 21426, Lr: 0.000300\n", "2020-02-09 13:36:47,711 Example #0\n", "2020-02-09 13:36:47,711 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:36:47,711 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:36:47,711 \tHypothesis: Jehovah no let Satan suffer Job . But e let Satan direct Job . E sey : ‘ Anybody wey get power , na im be your hand . ’\n", "2020-02-09 13:36:47,711 Example #1\n", "2020-02-09 13:36:47,712 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:36:47,712 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:36:47,712 \tHypothesis: Corinna comot for prison , e come sey : “ We comot for where we dey stay for where we dey stay . We waka for there and we waka for there .\n", "2020-02-09 13:36:47,712 Example #2\n", "2020-02-09 13:36:47,712 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:36:47,712 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:36:47,712 \tHypothesis: Rolroek and family love dey make dem enjoy theirself . Dem dey show love play , and dem love their family pass as dem dey do .\n", "2020-02-09 13:36:47,712 Example #3\n", "2020-02-09 13:36:47,713 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:36:47,713 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:36:47,713 \tHypothesis: We know sey na Jehovah love us , and e know wetin we suppose do .\n", "2020-02-09 13:36:47,713 Validation result (greedy) at epoch 70, step 15000: bleu: 13.68, loss: 51892.3750, ppl: 9.6412, duration: 12.7998s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:36:57,564 Epoch 70 Step: 15100 Batch Loss: 1.466820 Tokens per Sec: 21558, Lr: 0.000300\n", "2020-02-09 13:37:00,923 Epoch 70: total training loss 268.52\n", "2020-02-09 13:37:00,923 EPOCH 71\n", "2020-02-09 13:37:07,505 Epoch 71 Step: 15200 Batch Loss: 1.100581 Tokens per Sec: 21126, Lr: 0.000300\n", "2020-02-09 13:37:17,430 Epoch 71 Step: 15300 Batch Loss: 1.263563 Tokens per Sec: 21414, Lr: 0.000300\n", "2020-02-09 13:37:22,304 Epoch 71: total training loss 266.92\n", "2020-02-09 13:37:22,304 EPOCH 72\n", "2020-02-09 13:37:27,420 Epoch 72 Step: 15400 Batch Loss: 1.193709 Tokens per Sec: 20689, Lr: 0.000300\n", "2020-02-09 13:37:37,383 Epoch 72 Step: 15500 Batch Loss: 0.802885 Tokens per Sec: 21147, Lr: 0.000300\n", "2020-02-09 13:37:43,831 Epoch 72: total training loss 266.82\n", "2020-02-09 13:37:43,832 EPOCH 73\n", "2020-02-09 13:37:47,366 Epoch 73 Step: 15600 Batch Loss: 0.445664 Tokens per Sec: 21176, Lr: 0.000300\n", "2020-02-09 13:37:57,363 Epoch 73 Step: 15700 Batch Loss: 1.272757 Tokens per Sec: 21335, Lr: 0.000300\n", "2020-02-09 13:38:05,375 Epoch 73: total training loss 263.86\n", "2020-02-09 13:38:05,376 EPOCH 74\n", "2020-02-09 13:38:07,298 Epoch 74 Step: 15800 Batch Loss: 1.240086 Tokens per Sec: 22479, Lr: 0.000300\n", "2020-02-09 13:38:17,277 Epoch 74 Step: 15900 Batch Loss: 1.174128 Tokens per Sec: 21232, Lr: 0.000300\n", "2020-02-09 13:38:26,900 Epoch 74: total training loss 261.45\n", "2020-02-09 13:38:26,901 EPOCH 75\n", "2020-02-09 13:38:27,244 Epoch 75 Step: 16000 Batch Loss: 0.969944 Tokens per Sec: 19715, Lr: 0.000300\n", "2020-02-09 13:38:39,014 Example #0\n", "2020-02-09 13:38:39,015 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:38:39,015 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:38:39,015 \tHypothesis: Jehovah no let Job suffer Job . But e let am suffer Job . E sey : ‘ Anybody wey dey your hand dey , na im be the hand of am . ’\n", "2020-02-09 13:38:39,015 Example #1\n", "2020-02-09 13:38:39,015 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:38:39,015 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:38:39,015 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay , and we waka comot go one place wey far well well .\n", "2020-02-09 13:38:39,015 Example #2\n", "2020-02-09 13:38:39,015 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:38:39,015 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:38:39,015 \tHypothesis: Rometrius ( 1 ) We love people and our family people , we dey enjoy their love . We dey show dem love .\n", "2020-02-09 13:38:39,016 Example #3\n", "2020-02-09 13:38:39,016 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:38:39,016 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:38:39,016 \tHypothesis: As Jehovah take love us , na im make us dey happy and na im make us dey happy .\n", "2020-02-09 13:38:39,016 Validation result (greedy) at epoch 75, step 16000: bleu: 13.60, loss: 52236.7031, ppl: 9.7872, duration: 11.7712s\n", "2020-02-09 13:38:48,988 Epoch 75 Step: 16100 Batch Loss: 1.245626 Tokens per Sec: 21099, Lr: 0.000210\n", "2020-02-09 13:38:58,902 Epoch 75 Step: 16200 Batch Loss: 1.151787 Tokens per Sec: 21349, Lr: 0.000210\n", "2020-02-09 13:39:00,200 Epoch 75: total training loss 254.01\n", "2020-02-09 13:39:00,201 EPOCH 76\n", "2020-02-09 13:39:08,929 Epoch 76 Step: 16300 Batch Loss: 1.073789 Tokens per Sec: 21096, Lr: 0.000210\n", "2020-02-09 13:39:18,865 Epoch 76 Step: 16400 Batch Loss: 1.273003 Tokens per Sec: 21137, Lr: 0.000210\n", "2020-02-09 13:39:21,841 Epoch 76: total training loss 251.17\n", "2020-02-09 13:39:21,841 EPOCH 77\n", "2020-02-09 13:39:28,804 Epoch 77 Step: 16500 Batch Loss: 1.339782 Tokens per Sec: 21548, Lr: 0.000210\n", "2020-02-09 13:39:38,726 Epoch 77 Step: 16600 Batch Loss: 0.925406 Tokens per Sec: 20585, Lr: 0.000210\n", "2020-02-09 13:39:43,426 Epoch 77: total training loss 249.06\n", "2020-02-09 13:39:43,426 EPOCH 78\n", "2020-02-09 13:39:48,751 Epoch 78 Step: 16700 Batch Loss: 1.370967 Tokens per Sec: 20947, Lr: 0.000210\n", "2020-02-09 13:39:58,731 Epoch 78 Step: 16800 Batch Loss: 1.172879 Tokens per Sec: 21553, Lr: 0.000210\n", "2020-02-09 13:40:05,009 Epoch 78: total training loss 246.34\n", "2020-02-09 13:40:05,009 EPOCH 79\n", "2020-02-09 13:40:08,736 Epoch 79 Step: 16900 Batch Loss: 1.280709 Tokens per Sec: 20360, Lr: 0.000210\n", "2020-02-09 13:40:18,717 Epoch 79 Step: 17000 Batch Loss: 1.374907 Tokens per Sec: 21610, Lr: 0.000210\n", "2020-02-09 13:40:31,020 Example #0\n", "2020-02-09 13:40:31,020 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:40:31,020 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:40:31,020 \tHypothesis: Jehovah no let Satan suffer Job . But e let Satan do wetin e talk . E sey : ‘ Anybody wey dey use im hand swear for your hand . ’\n", "2020-02-09 13:40:31,020 Example #1\n", "2020-02-09 13:40:31,021 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:40:31,021 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:40:31,021 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for one place wey dem dey call raton . We waka comot go where we dey stay . We waka for there .\n", "2020-02-09 13:40:31,021 Example #2\n", "2020-02-09 13:40:31,021 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:40:31,021 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:40:31,021 \tHypothesis: Rostroek love dey make people respect each other , and their pikin dey show love . Dem dey care for their family when dem dey chop their pikin when dem dey chop .\n", "2020-02-09 13:40:31,021 Example #3\n", "2020-02-09 13:40:31,021 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:40:31,022 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:40:31,022 \tHypothesis: We know sey na Jehovah make everything , and wetin e really need .\n", "2020-02-09 13:40:31,022 Validation result (greedy) at epoch 79, step 17000: bleu: 13.63, loss: 52756.6953, ppl: 10.0120, duration: 12.3041s\n", "2020-02-09 13:40:38,846 Epoch 79: total training loss 244.83\n", "2020-02-09 13:40:38,846 EPOCH 80\n", "2020-02-09 13:40:40,980 Epoch 80 Step: 17100 Batch Loss: 1.366554 Tokens per Sec: 20688, Lr: 0.000210\n", "2020-02-09 13:40:50,887 Epoch 80 Step: 17200 Batch Loss: 0.921662 Tokens per Sec: 21148, Lr: 0.000210\n", "2020-02-09 13:41:00,293 Epoch 80: total training loss 243.67\n", "2020-02-09 13:41:00,293 EPOCH 81\n", "2020-02-09 13:41:00,837 Epoch 81 Step: 17300 Batch Loss: 1.227142 Tokens per Sec: 19506, Lr: 0.000210\n", "2020-02-09 13:41:10,745 Epoch 81 Step: 17400 Batch Loss: 1.014282 Tokens per Sec: 21354, Lr: 0.000210\n", "2020-02-09 13:41:20,759 Epoch 81 Step: 17500 Batch Loss: 1.388654 Tokens per Sec: 21215, Lr: 0.000210\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:41:21,836 Epoch 81: total training loss 241.66\n", "2020-02-09 13:41:21,837 EPOCH 82\n", "2020-02-09 13:41:30,702 Epoch 82 Step: 17600 Batch Loss: 1.071036 Tokens per Sec: 21079, Lr: 0.000210\n", "2020-02-09 13:41:40,662 Epoch 82 Step: 17700 Batch Loss: 1.366563 Tokens per Sec: 21398, Lr: 0.000210\n", "2020-02-09 13:41:43,343 Epoch 82: total training loss 239.91\n", "2020-02-09 13:41:43,343 EPOCH 83\n", "2020-02-09 13:41:50,607 Epoch 83 Step: 17800 Batch Loss: 0.447479 Tokens per Sec: 20887, Lr: 0.000210\n", "2020-02-09 13:42:00,555 Epoch 83 Step: 17900 Batch Loss: 1.230339 Tokens per Sec: 21363, Lr: 0.000210\n", "2020-02-09 13:42:04,916 Epoch 83: total training loss 239.55\n", "2020-02-09 13:42:04,916 EPOCH 84\n", "2020-02-09 13:42:10,509 Epoch 84 Step: 18000 Batch Loss: 1.350716 Tokens per Sec: 21147, Lr: 0.000210\n", "2020-02-09 13:42:22,835 Example #0\n", "2020-02-09 13:42:22,836 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:42:22,836 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:42:22,836 \tHypothesis: Jehovah no let Satan suffer Job . But e let am suffer Job . E tell am sey : ‘ Anybody wey don get power , e don use your hand do wetin e want . ’\n", "2020-02-09 13:42:22,836 Example #1\n", "2020-02-09 13:42:22,836 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:42:22,836 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:42:22,836 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay for where we dey stay . We come comot go where dem dey call raton top wood .\n", "2020-02-09 13:42:22,836 Example #2\n", "2020-02-09 13:42:22,836 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:42:22,837 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:42:22,837 \tHypothesis: We dey show love , we love dem , and we dey show love dem . We dey show dem love .\n", "2020-02-09 13:42:22,837 Example #3\n", "2020-02-09 13:42:22,837 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:42:22,837 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:42:22,837 \tHypothesis: As Jehovah love us and e know wetin we need , e know wetin go really make us happy .\n", "2020-02-09 13:42:22,837 Validation result (greedy) at epoch 84, step 18000: bleu: 13.39, loss: 53123.7109, ppl: 10.1738, duration: 12.3279s\n", "2020-02-09 13:42:32,815 Epoch 84 Step: 18100 Batch Loss: 1.229728 Tokens per Sec: 20826, Lr: 0.000210\n", "2020-02-09 13:42:38,961 Epoch 84: total training loss 238.82\n", "2020-02-09 13:42:38,961 EPOCH 85\n", "2020-02-09 13:42:42,782 Epoch 85 Step: 18200 Batch Loss: 1.178186 Tokens per Sec: 20938, Lr: 0.000210\n", "2020-02-09 13:42:52,711 Epoch 85 Step: 18300 Batch Loss: 1.064565 Tokens per Sec: 21595, Lr: 0.000210\n", "2020-02-09 13:43:00,435 Epoch 85: total training loss 235.47\n", "2020-02-09 13:43:00,435 EPOCH 86\n", "2020-02-09 13:43:02,664 Epoch 86 Step: 18400 Batch Loss: 0.958826 Tokens per Sec: 21064, Lr: 0.000210\n", "2020-02-09 13:43:12,630 Epoch 86 Step: 18500 Batch Loss: 1.276298 Tokens per Sec: 20971, Lr: 0.000210\n", "2020-02-09 13:43:22,059 Epoch 86: total training loss 234.78\n", "2020-02-09 13:43:22,059 EPOCH 87\n", "2020-02-09 13:43:22,606 Epoch 87 Step: 18600 Batch Loss: 0.976016 Tokens per Sec: 18135, Lr: 0.000210\n", "2020-02-09 13:43:32,521 Epoch 87 Step: 18700 Batch Loss: 1.222425 Tokens per Sec: 21613, Lr: 0.000210\n", "2020-02-09 13:43:42,430 Epoch 87 Step: 18800 Batch Loss: 1.335187 Tokens per Sec: 21328, Lr: 0.000210\n", "2020-02-09 13:43:43,514 Epoch 87: total training loss 232.08\n", "2020-02-09 13:43:43,515 EPOCH 88\n", "2020-02-09 13:43:52,405 Epoch 88 Step: 18900 Batch Loss: 1.283226 Tokens per Sec: 21084, Lr: 0.000210\n", "2020-02-09 13:44:02,465 Epoch 88 Step: 19000 Batch Loss: 1.182344 Tokens per Sec: 21357, Lr: 0.000210\n", "2020-02-09 13:44:14,514 Example #0\n", "2020-02-09 13:44:14,514 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:44:14,514 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:44:14,514 \tHypothesis: Jehovah no let Satan suffer Job . But e let am suffer Job . E sey : ‘ Anybody wey get power , na im dey give you hand . ’\n", "2020-02-09 13:44:14,514 Example #1\n", "2020-02-09 13:44:14,514 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:44:14,514 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:44:14,514 \tHypothesis: Corinna , wey come sey : “ We stay for evening , we come travel go another place wey far well well .\n", "2020-02-09 13:44:14,515 Example #2\n", "2020-02-09 13:44:14,515 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:44:14,515 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:44:14,515 \tHypothesis: We no dey show love play . We dey enjoy our family , and we dey show love play .\n", "2020-02-09 13:44:14,515 Example #3\n", "2020-02-09 13:44:14,515 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:44:14,515 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:44:14,515 \tHypothesis: As Jehovah love us , e know wetin e really need and wetin e really need .\n", "2020-02-09 13:44:14,516 Validation result (greedy) at epoch 88, step 19000: bleu: 13.57, loss: 53380.0625, ppl: 10.2883, duration: 12.0500s\n", "2020-02-09 13:44:17,207 Epoch 88: total training loss 230.15\n", "2020-02-09 13:44:17,207 EPOCH 89\n", "2020-02-09 13:44:24,557 Epoch 89 Step: 19100 Batch Loss: 0.548133 Tokens per Sec: 20741, Lr: 0.000210\n", "2020-02-09 13:44:34,597 Epoch 89 Step: 19200 Batch Loss: 0.832117 Tokens per Sec: 21632, Lr: 0.000210\n", "2020-02-09 13:44:38,896 Epoch 89: total training loss 229.20\n", "2020-02-09 13:44:38,896 EPOCH 90\n", "2020-02-09 13:44:44,624 Epoch 90 Step: 19300 Batch Loss: 0.876124 Tokens per Sec: 21552, Lr: 0.000210\n", "2020-02-09 13:44:54,566 Epoch 90 Step: 19400 Batch Loss: 1.264800 Tokens per Sec: 21036, Lr: 0.000210\n", "2020-02-09 13:45:00,410 Epoch 90: total training loss 228.24\n", "2020-02-09 13:45:00,410 EPOCH 91\n", "2020-02-09 13:45:04,549 Epoch 91 Step: 19500 Batch Loss: 1.024360 Tokens per Sec: 20341, Lr: 0.000210\n", "2020-02-09 13:45:14,535 Epoch 91 Step: 19600 Batch Loss: 1.097890 Tokens per Sec: 21405, Lr: 0.000210\n", "2020-02-09 13:45:22,045 Epoch 91: total training loss 227.40\n", "2020-02-09 13:45:22,045 EPOCH 92\n", "2020-02-09 13:45:24,575 Epoch 92 Step: 19700 Batch Loss: 0.919955 Tokens per Sec: 20557, Lr: 0.000210\n", "2020-02-09 13:45:34,523 Epoch 92 Step: 19800 Batch Loss: 1.160814 Tokens per Sec: 21590, Lr: 0.000210\n", "2020-02-09 13:45:43,594 Epoch 92: total training loss 226.09\n", "2020-02-09 13:45:43,594 EPOCH 93\n", "2020-02-09 13:45:44,429 Epoch 93 Step: 19900 Batch Loss: 1.192731 Tokens per Sec: 22762, Lr: 0.000210\n", "2020-02-09 13:45:54,382 Epoch 93 Step: 20000 Batch Loss: 0.361479 Tokens per Sec: 21149, Lr: 0.000210\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:46:07,578 Example #0\n", "2020-02-09 13:46:07,578 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:46:07,578 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:46:07,578 \tHypothesis: Jehovah no let Satan suffer Job . But e let Satan do wetin e talk . E sey : ‘ Anybody wey get for hand , e don use im hand do wetin e want . ’\n", "2020-02-09 13:46:07,578 Example #1\n", "2020-02-09 13:46:07,578 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:46:07,578 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:46:07,579 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay , and we waka comot go where we dey stay . We waka for there go where dem dey call rakilometers ( 15 miles ) .\n", "2020-02-09 13:46:07,579 Example #2\n", "2020-02-09 13:46:07,579 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:46:07,579 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:46:07,579 \tHypothesis: We like to marry ( picture . ) We dey show dem love , and we love their family ( Gal .\n", "2020-02-09 13:46:07,579 Example #3\n", "2020-02-09 13:46:07,579 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:46:07,579 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:46:07,579 \tHypothesis: We know sey Jehovah love us and e know wetin we need .\n", "2020-02-09 13:46:07,579 Validation result (greedy) at epoch 93, step 20000: bleu: 13.91, loss: 53823.9727, ppl: 10.4897, duration: 13.1968s\n", "2020-02-09 13:46:17,536 Epoch 93 Step: 20100 Batch Loss: 1.070482 Tokens per Sec: 21340, Lr: 0.000210\n", "2020-02-09 13:46:18,325 Epoch 93: total training loss 224.44\n", "2020-02-09 13:46:18,325 EPOCH 94\n", "2020-02-09 13:46:27,463 Epoch 94 Step: 20200 Batch Loss: 1.145636 Tokens per Sec: 21201, Lr: 0.000210\n", "2020-02-09 13:46:37,448 Epoch 94 Step: 20300 Batch Loss: 0.872220 Tokens per Sec: 21358, Lr: 0.000210\n", "2020-02-09 13:46:39,841 Epoch 94: total training loss 222.53\n", "2020-02-09 13:46:39,842 EPOCH 95\n", "2020-02-09 13:46:47,500 Epoch 95 Step: 20400 Batch Loss: 1.130787 Tokens per Sec: 20967, Lr: 0.000210\n", "2020-02-09 13:46:57,491 Epoch 95 Step: 20500 Batch Loss: 1.082057 Tokens per Sec: 21452, Lr: 0.000210\n", "2020-02-09 13:47:01,392 Epoch 95: total training loss 221.00\n", "2020-02-09 13:47:01,393 EPOCH 96\n", "2020-02-09 13:47:07,518 Epoch 96 Step: 20600 Batch Loss: 1.183320 Tokens per Sec: 20816, Lr: 0.000210\n", "2020-02-09 13:47:17,572 Epoch 96 Step: 20700 Batch Loss: 1.090420 Tokens per Sec: 21015, Lr: 0.000210\n", "2020-02-09 13:47:23,146 Epoch 96: total training loss 221.56\n", "2020-02-09 13:47:23,147 EPOCH 97\n", "2020-02-09 13:47:27,580 Epoch 97 Step: 20800 Batch Loss: 0.864067 Tokens per Sec: 21545, Lr: 0.000210\n", "2020-02-09 13:47:37,599 Epoch 97 Step: 20900 Batch Loss: 1.084702 Tokens per Sec: 21511, Lr: 0.000210\n", "2020-02-09 13:47:44,692 Epoch 97: total training loss 217.74\n", "2020-02-09 13:47:44,692 EPOCH 98\n", "2020-02-09 13:47:47,636 Epoch 98 Step: 21000 Batch Loss: 0.787406 Tokens per Sec: 20613, Lr: 0.000210\n", "2020-02-09 13:47:59,112 Example #0\n", "2020-02-09 13:47:59,112 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:47:59,112 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:47:59,112 \tHypothesis: Jehovah no let Satan do wetin e no like . E let Satan do wetin e promise . E sey : ‘ Anybody wey dey your hand don weak , but e never do wetin e want . ’\n", "2020-02-09 13:47:59,112 Example #1\n", "2020-02-09 13:47:59,112 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:47:59,113 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:47:59,113 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for evening and we waka comot . We waka comot for where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 13:47:59,113 Example #2\n", "2020-02-09 13:47:59,113 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:47:59,113 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:47:59,113 \tHypothesis: We love Rostroke ( wey dey show our family love , and wey dey show love theirself .\n", "2020-02-09 13:47:59,113 Example #3\n", "2020-02-09 13:47:59,113 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:47:59,113 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:47:59,113 \tHypothesis: As Jehovah love us , we know wetin e really need and e dey make us happy .\n", "2020-02-09 13:47:59,113 Validation result (greedy) at epoch 98, step 21000: bleu: 13.27, loss: 54286.3945, ppl: 10.7037, duration: 11.4772s\n", "2020-02-09 13:48:09,145 Epoch 98 Step: 21100 Batch Loss: 0.896528 Tokens per Sec: 21080, Lr: 0.000210\n", "2020-02-09 13:48:17,765 Epoch 98: total training loss 217.97\n", "2020-02-09 13:48:17,765 EPOCH 99\n", "2020-02-09 13:48:19,095 Epoch 99 Step: 21200 Batch Loss: 0.963753 Tokens per Sec: 20769, Lr: 0.000210\n", "2020-02-09 13:48:29,002 Epoch 99 Step: 21300 Batch Loss: 1.070642 Tokens per Sec: 21223, Lr: 0.000210\n", "2020-02-09 13:48:38,915 Epoch 99 Step: 21400 Batch Loss: 1.051394 Tokens per Sec: 21467, Lr: 0.000210\n", "2020-02-09 13:48:39,214 Epoch 99: total training loss 216.58\n", "2020-02-09 13:48:39,214 EPOCH 100\n", "2020-02-09 13:48:48,886 Epoch 100 Step: 21500 Batch Loss: 1.141423 Tokens per Sec: 20934, Lr: 0.000210\n", "2020-02-09 13:48:58,818 Epoch 100 Step: 21600 Batch Loss: 1.209352 Tokens per Sec: 21579, Lr: 0.000210\n", "2020-02-09 13:49:00,698 Epoch 100: total training loss 215.87\n", "2020-02-09 13:49:00,698 EPOCH 101\n", "2020-02-09 13:49:08,758 Epoch 101 Step: 21700 Batch Loss: 1.094492 Tokens per Sec: 21748, Lr: 0.000210\n", "2020-02-09 13:49:18,671 Epoch 101 Step: 21800 Batch Loss: 0.616443 Tokens per Sec: 21013, Lr: 0.000210\n", "2020-02-09 13:49:22,246 Epoch 101: total training loss 215.19\n", "2020-02-09 13:49:22,246 EPOCH 102\n", "2020-02-09 13:49:28,668 Epoch 102 Step: 21900 Batch Loss: 1.103251 Tokens per Sec: 20765, Lr: 0.000210\n", "2020-02-09 13:49:38,643 Epoch 102 Step: 22000 Batch Loss: 1.094959 Tokens per Sec: 21129, Lr: 0.000210\n", "2020-02-09 13:49:51,805 Example #0\n", "2020-02-09 13:49:51,806 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:49:51,806 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:49:51,806 \tHypothesis: Jehovah no let Satan suffer Job . But e let am suffer Job . E tell am sey : ‘ Anybody wey get power , e don use your hand do wetin e want . ’\n", "2020-02-09 13:49:51,806 Example #1\n", "2020-02-09 13:49:51,806 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:49:51,806 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:49:51,806 \tHypothesis: Corinna , e come sey : “ We comot for where we dey stay , and we waka comot from where we dey stay . We waka for there go there . We dey there to 25 miles ( 15 km ) .\n", "2020-02-09 13:49:51,806 Example #2\n", "2020-02-09 13:49:51,807 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:49:51,807 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:49:51,807 \tHypothesis: We like to enjoy their picture . We dey enjoy our family , and we dey show love play . We dey show dem how to care for family and when dem dey chop our pikin .\n", "2020-02-09 13:49:51,807 Example #3\n", "2020-02-09 13:49:51,807 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:49:51,807 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:49:51,807 \tHypothesis: As Jehovah love us and e know wetin we need , e know wetin we need and e dey really make us happy .\n", "2020-02-09 13:49:51,807 Validation result (greedy) at epoch 102, step 22000: bleu: 13.92, loss: 54675.6367, ppl: 10.8871, duration: 13.1641s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:49:56,998 Epoch 102: total training loss 213.19\n", "2020-02-09 13:49:56,998 EPOCH 103\n", "2020-02-09 13:50:01,802 Epoch 103 Step: 22100 Batch Loss: 1.089305 Tokens per Sec: 21327, Lr: 0.000147\n", "2020-02-09 13:50:11,727 Epoch 103 Step: 22200 Batch Loss: 0.896127 Tokens per Sec: 21282, Lr: 0.000147\n", "2020-02-09 13:50:18,547 Epoch 103: total training loss 207.89\n", "2020-02-09 13:50:18,547 EPOCH 104\n", "2020-02-09 13:50:21,751 Epoch 104 Step: 22300 Batch Loss: 0.938026 Tokens per Sec: 20639, Lr: 0.000147\n", "2020-02-09 13:50:31,744 Epoch 104 Step: 22400 Batch Loss: 1.021022 Tokens per Sec: 21276, Lr: 0.000147\n", "2020-02-09 13:50:40,010 Epoch 104: total training loss 205.96\n", "2020-02-09 13:50:40,011 EPOCH 105\n", "2020-02-09 13:50:41,742 Epoch 105 Step: 22500 Batch Loss: 0.934909 Tokens per Sec: 21536, Lr: 0.000147\n", "2020-02-09 13:50:51,699 Epoch 105 Step: 22600 Batch Loss: 1.191996 Tokens per Sec: 21250, Lr: 0.000147\n", "2020-02-09 13:51:01,456 Epoch 105: total training loss 204.10\n", "2020-02-09 13:51:01,456 EPOCH 106\n", "2020-02-09 13:51:01,701 Epoch 106 Step: 22700 Batch Loss: 1.001169 Tokens per Sec: 19329, Lr: 0.000147\n", "2020-02-09 13:51:11,630 Epoch 106 Step: 22800 Batch Loss: 0.846913 Tokens per Sec: 21166, Lr: 0.000147\n", "2020-02-09 13:51:21,535 Epoch 106 Step: 22900 Batch Loss: 0.811543 Tokens per Sec: 21118, Lr: 0.000147\n", "2020-02-09 13:51:23,030 Epoch 106: total training loss 204.36\n", "2020-02-09 13:51:23,030 EPOCH 107\n", "2020-02-09 13:51:31,487 Epoch 107 Step: 23000 Batch Loss: 0.401998 Tokens per Sec: 21023, Lr: 0.000147\n", "2020-02-09 13:51:43,727 Example #0\n", "2020-02-09 13:51:43,728 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:51:43,728 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:51:43,728 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey your hand don turn to am , but e don already talk sey e don already do wetin you want . ’\n", "2020-02-09 13:51:43,728 Example #1\n", "2020-02-09 13:51:43,728 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:51:43,728 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:51:43,728 \tHypothesis: Corinna , wey come sey : “ We comot for where we dey stay , we come dey stay for where we dey stay . We waka for there go another place for evening .\n", "2020-02-09 13:51:43,728 Example #2\n", "2020-02-09 13:51:43,729 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:51:43,729 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:51:43,729 \tHypothesis: We dey show love , family ( 1 ) how we love and our family and our family people dey show love .\n", "2020-02-09 13:51:43,729 Example #3\n", "2020-02-09 13:51:43,729 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:51:43,729 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:51:43,729 \tHypothesis: As Jehovah take love us , na im make us know wetin we really need and e dey make us happy .\n", "2020-02-09 13:51:43,729 Validation result (greedy) at epoch 107, step 23000: bleu: 14.01, loss: 54995.6445, ppl: 11.0403, duration: 12.2415s\n", "2020-02-09 13:51:53,642 Epoch 107 Step: 23100 Batch Loss: 1.059401 Tokens per Sec: 21711, Lr: 0.000147\n", "2020-02-09 13:51:56,792 Epoch 107: total training loss 203.60\n", "2020-02-09 13:51:56,792 EPOCH 108\n", "2020-02-09 13:52:03,582 Epoch 108 Step: 23200 Batch Loss: 1.087745 Tokens per Sec: 21250, Lr: 0.000147\n", "2020-02-09 13:52:13,496 Epoch 108 Step: 23300 Batch Loss: 1.063015 Tokens per Sec: 21327, Lr: 0.000147\n", "2020-02-09 13:52:18,377 Epoch 108: total training loss 202.56\n", "2020-02-09 13:52:18,377 EPOCH 109\n", "2020-02-09 13:52:23,481 Epoch 109 Step: 23400 Batch Loss: 1.082638 Tokens per Sec: 21526, Lr: 0.000147\n", "2020-02-09 13:52:33,333 Epoch 109 Step: 23500 Batch Loss: 0.983980 Tokens per Sec: 20850, Lr: 0.000147\n", "2020-02-09 13:52:39,747 Epoch 109: total training loss 200.99\n", "2020-02-09 13:52:39,747 EPOCH 110\n", "2020-02-09 13:52:43,235 Epoch 110 Step: 23600 Batch Loss: 1.046511 Tokens per Sec: 21247, Lr: 0.000147\n", "2020-02-09 13:52:53,140 Epoch 110 Step: 23700 Batch Loss: 1.021222 Tokens per Sec: 21363, Lr: 0.000147\n", "2020-02-09 13:53:01,274 Epoch 110: total training loss 200.96\n", "2020-02-09 13:53:01,274 EPOCH 111\n", "2020-02-09 13:53:03,110 Epoch 111 Step: 23800 Batch Loss: 0.839858 Tokens per Sec: 20800, Lr: 0.000147\n", "2020-02-09 13:53:13,121 Epoch 111 Step: 23900 Batch Loss: 1.081153 Tokens per Sec: 21154, Lr: 0.000147\n", "2020-02-09 13:53:22,706 Epoch 111: total training loss 197.80\n", "2020-02-09 13:53:22,707 EPOCH 112\n", "2020-02-09 13:53:23,051 Epoch 112 Step: 24000 Batch Loss: 1.055055 Tokens per Sec: 21954, Lr: 0.000147\n", "2020-02-09 13:53:35,284 Example #0\n", "2020-02-09 13:53:35,284 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:53:35,285 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:53:35,285 \tHypothesis: Jehovah no let Satan suffer Job . But e let am know sey : ‘ Anybody wey dey your hand don weak , but e never do am . ’\n", "2020-02-09 13:53:35,285 Example #1\n", "2020-02-09 13:53:35,285 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:53:35,285 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:53:35,285 \tHypothesis: Corinna , wey come sey : “ We stay for where we dey stay , we dey stay there for where we dey stay . We waka for there go another place for evening .\n", "2020-02-09 13:53:35,285 Example #2\n", "2020-02-09 13:53:35,285 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:53:35,286 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:53:35,286 \tHypothesis: We dey show love play , and we dey enjoy our family . We dey show love when we dey show dem how to love their pikin .\n", "2020-02-09 13:53:35,286 Example #3\n", "2020-02-09 13:53:35,286 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:53:35,286 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:53:35,286 \tHypothesis: As Jehovah love us , na im make us and wetin we need to really dey happy .\n", "2020-02-09 13:53:35,286 Validation result (greedy) at epoch 112, step 24000: bleu: 13.91, loss: 55179.3203, ppl: 11.1293, duration: 12.2347s\n", "2020-02-09 13:53:45,184 Epoch 112 Step: 24100 Batch Loss: 1.099728 Tokens per Sec: 21260, Lr: 0.000147\n", "2020-02-09 13:53:55,036 Epoch 112 Step: 24200 Batch Loss: 0.993745 Tokens per Sec: 21659, Lr: 0.000147\n", "2020-02-09 13:53:56,220 Epoch 112: total training loss 196.97\n", "2020-02-09 13:53:56,220 EPOCH 113\n", "2020-02-09 13:54:04,936 Epoch 113 Step: 24300 Batch Loss: 0.848847 Tokens per Sec: 21065, Lr: 0.000147\n", "2020-02-09 13:54:14,853 Epoch 113 Step: 24400 Batch Loss: 1.102259 Tokens per Sec: 21254, Lr: 0.000147\n", "2020-02-09 13:54:17,838 Epoch 113: total training loss 198.78\n", "2020-02-09 13:54:17,838 EPOCH 114\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:54:24,836 Epoch 114 Step: 24500 Batch Loss: 1.077271 Tokens per Sec: 21529, Lr: 0.000147\n", "2020-02-09 13:54:34,761 Epoch 114 Step: 24600 Batch Loss: 0.907009 Tokens per Sec: 21087, Lr: 0.000147\n", "2020-02-09 13:54:39,311 Epoch 114: total training loss 196.42\n", "2020-02-09 13:54:39,311 EPOCH 115\n", "2020-02-09 13:54:44,712 Epoch 115 Step: 24700 Batch Loss: 0.819712 Tokens per Sec: 21185, Lr: 0.000147\n", "2020-02-09 13:54:54,651 Epoch 115 Step: 24800 Batch Loss: 0.818831 Tokens per Sec: 21395, Lr: 0.000147\n", "2020-02-09 13:55:00,797 Epoch 115: total training loss 196.22\n", "2020-02-09 13:55:00,798 EPOCH 116\n", "2020-02-09 13:55:04,582 Epoch 116 Step: 24900 Batch Loss: 1.006718 Tokens per Sec: 21151, Lr: 0.000147\n", "2020-02-09 13:55:14,478 Epoch 116 Step: 25000 Batch Loss: 1.004069 Tokens per Sec: 21288, Lr: 0.000147\n", "2020-02-09 13:55:26,365 Example #0\n", "2020-02-09 13:55:26,366 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:55:26,366 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:55:26,366 \tHypothesis: Jehovah no let Satan suffer Job . But e let am suffer Job . E tell am sey : ‘ Anybody wey get power , na im be that . ’\n", "2020-02-09 13:55:26,366 Example #1\n", "2020-02-09 13:55:26,366 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:55:26,366 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:55:26,366 \tHypothesis: Corinna , comot for prison , e come sey : “ We stay there for where we dey stay . We waka comot go where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 13:55:26,366 Example #2\n", "2020-02-09 13:55:26,366 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:55:26,367 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:55:26,367 \tHypothesis: We like to dey wear love , and we dey enjoy our family pass before .\n", "2020-02-09 13:55:26,367 Example #3\n", "2020-02-09 13:55:26,367 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:55:26,367 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:55:26,367 \tHypothesis: As Jehovah love us , na im make us dey really happy and e know wetin we need .\n", "2020-02-09 13:55:26,367 Validation result (greedy) at epoch 116, step 25000: bleu: 14.03, loss: 55375.5195, ppl: 11.2250, duration: 11.8887s\n", "2020-02-09 13:55:34,094 Epoch 116: total training loss 195.01\n", "2020-02-09 13:55:34,095 EPOCH 117\n", "2020-02-09 13:55:36,312 Epoch 117 Step: 25100 Batch Loss: 0.982647 Tokens per Sec: 20719, Lr: 0.000147\n", "2020-02-09 13:55:46,223 Epoch 117 Step: 25200 Batch Loss: 1.044865 Tokens per Sec: 21176, Lr: 0.000147\n", "2020-02-09 13:55:55,563 Epoch 117: total training loss 193.82\n", "2020-02-09 13:55:55,563 EPOCH 118\n", "2020-02-09 13:55:56,208 Epoch 118 Step: 25300 Batch Loss: 0.924655 Tokens per Sec: 20524, Lr: 0.000147\n", "2020-02-09 13:56:06,103 Epoch 118 Step: 25400 Batch Loss: 0.868301 Tokens per Sec: 21374, Lr: 0.000147\n", "2020-02-09 13:56:16,158 Epoch 118 Step: 25500 Batch Loss: 1.074172 Tokens per Sec: 21530, Lr: 0.000147\n", "2020-02-09 13:56:16,952 Epoch 118: total training loss 191.56\n", "2020-02-09 13:56:16,953 EPOCH 119\n", "2020-02-09 13:56:26,153 Epoch 119 Step: 25600 Batch Loss: 0.846564 Tokens per Sec: 21172, Lr: 0.000147\n", "2020-02-09 13:56:36,067 Epoch 119 Step: 25700 Batch Loss: 1.155276 Tokens per Sec: 21148, Lr: 0.000147\n", "2020-02-09 13:56:38,532 Epoch 119: total training loss 193.32\n", "2020-02-09 13:56:38,532 EPOCH 120\n", "2020-02-09 13:56:46,027 Epoch 120 Step: 25800 Batch Loss: 0.948716 Tokens per Sec: 20924, Lr: 0.000147\n", "2020-02-09 13:56:55,976 Epoch 120 Step: 25900 Batch Loss: 1.005189 Tokens per Sec: 21534, Lr: 0.000147\n", "2020-02-09 13:57:00,048 Epoch 120: total training loss 191.30\n", "2020-02-09 13:57:00,048 EPOCH 121\n", "2020-02-09 13:57:05,941 Epoch 121 Step: 26000 Batch Loss: 0.358310 Tokens per Sec: 21418, Lr: 0.000147\n", "2020-02-09 13:57:18,531 Example #0\n", "2020-02-09 13:57:18,532 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:57:18,532 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:57:18,532 \tHypothesis: Jehovah no let Satan suffer Job . But e let am know sey : ‘ Anybody wey e want do , na im dey give am hand . ’\n", "2020-02-09 13:57:18,532 Example #1\n", "2020-02-09 13:57:18,532 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:57:18,532 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:57:18,532 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay . We come comot go where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 13:57:18,532 Example #2\n", "2020-02-09 13:57:18,533 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:57:18,533 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:57:18,533 \tHypothesis: We no dey show love play . We dey enjoy to love our family ( 1 ) love dey control ourself when we dey chop our pikin .\n", "2020-02-09 13:57:18,533 Example #3\n", "2020-02-09 13:57:18,533 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:57:18,533 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:57:18,533 \tHypothesis: As Jehovah love us , na im make us fit really know wetin we need and e dey happy .\n", "2020-02-09 13:57:18,533 Validation result (greedy) at epoch 121, step 26000: bleu: 13.91, loss: 56022.5273, ppl: 11.5467, duration: 12.5917s\n", "2020-02-09 13:57:28,520 Epoch 121 Step: 26100 Batch Loss: 0.563409 Tokens per Sec: 21045, Lr: 0.000147\n", "2020-02-09 13:57:34,308 Epoch 121: total training loss 191.19\n", "2020-02-09 13:57:34,309 EPOCH 122\n", "2020-02-09 13:57:38,539 Epoch 122 Step: 26200 Batch Loss: 1.093610 Tokens per Sec: 21572, Lr: 0.000147\n", "2020-02-09 13:57:48,594 Epoch 122 Step: 26300 Batch Loss: 0.408293 Tokens per Sec: 20788, Lr: 0.000147\n", "2020-02-09 13:57:55,991 Epoch 122: total training loss 190.34\n", "2020-02-09 13:57:55,991 EPOCH 123\n", "2020-02-09 13:57:58,594 Epoch 123 Step: 26400 Batch Loss: 0.932069 Tokens per Sec: 20273, Lr: 0.000147\n", "2020-02-09 13:58:08,571 Epoch 123 Step: 26500 Batch Loss: 0.758683 Tokens per Sec: 21504, Lr: 0.000147\n", "2020-02-09 13:58:17,515 Epoch 123: total training loss 189.80\n", "2020-02-09 13:58:17,515 EPOCH 124\n", "2020-02-09 13:58:18,548 Epoch 124 Step: 26600 Batch Loss: 0.991884 Tokens per Sec: 19099, Lr: 0.000147\n", "2020-02-09 13:58:28,482 Epoch 124 Step: 26700 Batch Loss: 0.715300 Tokens per Sec: 21201, Lr: 0.000147\n", "2020-02-09 13:58:38,437 Epoch 124 Step: 26800 Batch Loss: 0.970540 Tokens per Sec: 21578, Lr: 0.000147\n", "2020-02-09 13:58:39,037 Epoch 124: total training loss 188.82\n", "2020-02-09 13:58:39,037 EPOCH 125\n", "2020-02-09 13:58:48,432 Epoch 125 Step: 26900 Batch Loss: 0.984332 Tokens per Sec: 21287, Lr: 0.000147\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 13:58:58,405 Epoch 125 Step: 27000 Batch Loss: 0.889066 Tokens per Sec: 21171, Lr: 0.000147\n", "2020-02-09 13:59:11,081 Example #0\n", "2020-02-09 13:59:11,081 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 13:59:11,081 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 13:59:11,081 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey your hand don weak . ’\n", "2020-02-09 13:59:11,082 Example #1\n", "2020-02-09 13:59:11,082 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 13:59:11,082 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 13:59:11,082 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for where we dey stay . We waka comot from where dem dey call railor . We waka comot go there .\n", "2020-02-09 13:59:11,082 Example #2\n", "2020-02-09 13:59:11,082 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 13:59:11,082 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 13:59:11,082 \tHypothesis: We no dey show love play with our family ( muc·gaʹstation ) and our family people wey dey show love theirself .\n", "2020-02-09 13:59:11,082 Example #3\n", "2020-02-09 13:59:11,082 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 13:59:11,083 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 13:59:11,083 \tHypothesis: As Jehovah love us , e know wetin we need and wetin we really need .\n", "2020-02-09 13:59:11,083 Validation result (greedy) at epoch 125, step 27000: bleu: 14.00, loss: 56223.9844, ppl: 11.6487, duration: 12.6771s\n", "2020-02-09 13:59:13,278 Epoch 125: total training loss 187.16\n", "2020-02-09 13:59:13,279 EPOCH 126\n", "2020-02-09 13:59:21,195 Epoch 126 Step: 27100 Batch Loss: 0.721855 Tokens per Sec: 21094, Lr: 0.000147\n", "2020-02-09 13:59:31,130 Epoch 126 Step: 27200 Batch Loss: 1.002054 Tokens per Sec: 21434, Lr: 0.000147\n", "2020-02-09 13:59:34,703 Epoch 126: total training loss 186.39\n", "2020-02-09 13:59:34,703 EPOCH 127\n", "2020-02-09 13:59:41,080 Epoch 127 Step: 27300 Batch Loss: 0.944836 Tokens per Sec: 21220, Lr: 0.000147\n", "2020-02-09 13:59:50,994 Epoch 127 Step: 27400 Batch Loss: 0.957453 Tokens per Sec: 21689, Lr: 0.000147\n", "2020-02-09 13:59:56,110 Epoch 127: total training loss 185.56\n", "2020-02-09 13:59:56,110 EPOCH 128\n", "2020-02-09 14:00:01,064 Epoch 128 Step: 27500 Batch Loss: 0.765037 Tokens per Sec: 21447, Lr: 0.000147\n", "2020-02-09 14:00:10,996 Epoch 128 Step: 27600 Batch Loss: 0.911300 Tokens per Sec: 21127, Lr: 0.000147\n", "2020-02-09 14:00:17,688 Epoch 128: total training loss 185.93\n", "2020-02-09 14:00:17,688 EPOCH 129\n", "2020-02-09 14:00:21,040 Epoch 129 Step: 27700 Batch Loss: 1.015439 Tokens per Sec: 20892, Lr: 0.000147\n", "2020-02-09 14:00:31,050 Epoch 129 Step: 27800 Batch Loss: 1.008666 Tokens per Sec: 21374, Lr: 0.000147\n", "2020-02-09 14:00:39,172 Epoch 129: total training loss 184.66\n", "2020-02-09 14:00:39,172 EPOCH 130\n", "2020-02-09 14:00:41,003 Epoch 130 Step: 27900 Batch Loss: 0.991815 Tokens per Sec: 21419, Lr: 0.000147\n", "2020-02-09 14:00:50,930 Epoch 130 Step: 28000 Batch Loss: 0.447445 Tokens per Sec: 21173, Lr: 0.000147\n", "2020-02-09 14:01:03,622 Example #0\n", "2020-02-09 14:01:03,622 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:01:03,622 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:01:03,622 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey your hand don turn to your hand . ’\n", "2020-02-09 14:01:03,622 Example #1\n", "2020-02-09 14:01:03,622 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:01:03,622 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:01:03,623 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for where we dey stay . We waka for there and we waka comot go where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 14:01:03,623 Example #2\n", "2020-02-09 14:01:03,623 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:01:03,623 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:01:03,623 \tHypothesis: We no dey show this kind love play . We dey enjoy to show dem love , and we dey show dem how their pikin dey dress .\n", "2020-02-09 14:01:03,623 Example #3\n", "2020-02-09 14:01:03,623 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:01:03,623 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:01:03,623 \tHypothesis: As Jehovah love us and na im make us , e know wetin we need and e dey really make us happy .\n", "2020-02-09 14:01:03,623 Validation result (greedy) at epoch 130, step 28000: bleu: 13.99, loss: 56479.9727, ppl: 11.7797, duration: 12.6928s\n", "2020-02-09 14:01:13,502 Epoch 130: total training loss 185.09\n", "2020-02-09 14:01:13,502 EPOCH 131\n", "2020-02-09 14:01:13,650 Epoch 131 Step: 28100 Batch Loss: 0.641208 Tokens per Sec: 12890, Lr: 0.000103\n", "2020-02-09 14:01:23,705 Epoch 131 Step: 28200 Batch Loss: 0.602730 Tokens per Sec: 21146, Lr: 0.000103\n", "2020-02-09 14:01:33,758 Epoch 131 Step: 28300 Batch Loss: 0.435414 Tokens per Sec: 20979, Lr: 0.000103\n", "2020-02-09 14:01:35,267 Epoch 131: total training loss 180.13\n", "2020-02-09 14:01:35,267 EPOCH 132\n", "2020-02-09 14:01:43,812 Epoch 132 Step: 28400 Batch Loss: 1.007265 Tokens per Sec: 21338, Lr: 0.000103\n", "2020-02-09 14:01:53,749 Epoch 132 Step: 28500 Batch Loss: 0.944787 Tokens per Sec: 21310, Lr: 0.000103\n", "2020-02-09 14:01:56,827 Epoch 132: total training loss 179.81\n", "2020-02-09 14:01:56,827 EPOCH 133\n", "2020-02-09 14:02:03,730 Epoch 133 Step: 28600 Batch Loss: 0.863838 Tokens per Sec: 20838, Lr: 0.000103\n", "2020-02-09 14:02:13,794 Epoch 133 Step: 28700 Batch Loss: 0.739012 Tokens per Sec: 21163, Lr: 0.000103\n", "2020-02-09 14:02:18,407 Epoch 133: total training loss 178.48\n", "2020-02-09 14:02:18,407 EPOCH 134\n", "2020-02-09 14:02:23,828 Epoch 134 Step: 28800 Batch Loss: 0.753290 Tokens per Sec: 20431, Lr: 0.000103\n", "2020-02-09 14:02:33,765 Epoch 134 Step: 28900 Batch Loss: 0.892096 Tokens per Sec: 21273, Lr: 0.000103\n", "2020-02-09 14:02:40,117 Epoch 134: total training loss 179.89\n", "2020-02-09 14:02:40,117 EPOCH 135\n", "2020-02-09 14:02:43,710 Epoch 135 Step: 29000 Batch Loss: 0.874406 Tokens per Sec: 20875, Lr: 0.000103\n", "2020-02-09 14:02:56,862 Example #0\n", "2020-02-09 14:02:56,862 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:02:56,862 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:02:56,862 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey your hand don already talk sey e don already dey your hand . ’\n", "2020-02-09 14:02:56,862 Example #1\n", "2020-02-09 14:02:56,862 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:02:56,863 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:02:56,863 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay , and we dey stay there for where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 14:02:56,863 Example #2\n", "2020-02-09 14:02:56,863 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:02:56,863 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:02:56,863 \tHypothesis: We dey show love play , and we dey enjoy our family . We dey enjoy to love ourself when we dey teach our children Bible student .\n", "2020-02-09 14:02:56,863 Example #3\n", "2020-02-09 14:02:56,863 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:02:56,863 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:02:56,863 \tHypothesis: As Jehovah love us , na im make us know wetin we need and wetin we no fit do .\n", "2020-02-09 14:02:56,863 Validation result (greedy) at epoch 135, step 29000: bleu: 14.12, loss: 56550.9883, ppl: 11.8162, duration: 13.1529s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:03:06,791 Epoch 135 Step: 29100 Batch Loss: 0.807884 Tokens per Sec: 21170, Lr: 0.000103\n", "2020-02-09 14:03:14,881 Epoch 135: total training loss 178.83\n", "2020-02-09 14:03:14,881 EPOCH 136\n", "2020-02-09 14:03:16,803 Epoch 136 Step: 29200 Batch Loss: 0.718593 Tokens per Sec: 21128, Lr: 0.000103\n", "2020-02-09 14:03:26,802 Epoch 136 Step: 29300 Batch Loss: 0.791574 Tokens per Sec: 20882, Lr: 0.000103\n", "2020-02-09 14:03:36,667 Epoch 136: total training loss 178.04\n", "2020-02-09 14:03:36,667 EPOCH 137\n", "2020-02-09 14:03:36,811 Epoch 137 Step: 29400 Batch Loss: 0.804677 Tokens per Sec: 16783, Lr: 0.000103\n", "2020-02-09 14:03:46,736 Epoch 137 Step: 29500 Batch Loss: 0.883674 Tokens per Sec: 21302, Lr: 0.000103\n", "2020-02-09 14:03:56,702 Epoch 137 Step: 29600 Batch Loss: 0.312471 Tokens per Sec: 21321, Lr: 0.000103\n", "2020-02-09 14:03:58,207 Epoch 137: total training loss 176.31\n", "2020-02-09 14:03:58,207 EPOCH 138\n", "2020-02-09 14:04:06,750 Epoch 138 Step: 29700 Batch Loss: 0.816679 Tokens per Sec: 20878, Lr: 0.000103\n", "2020-02-09 14:04:16,690 Epoch 138 Step: 29800 Batch Loss: 0.350733 Tokens per Sec: 21533, Lr: 0.000103\n", "2020-02-09 14:04:19,777 Epoch 138: total training loss 175.87\n", "2020-02-09 14:04:19,778 EPOCH 139\n", "2020-02-09 14:04:26,668 Epoch 139 Step: 29900 Batch Loss: 0.878610 Tokens per Sec: 20752, Lr: 0.000103\n", "2020-02-09 14:04:36,582 Epoch 139 Step: 30000 Batch Loss: 0.955152 Tokens per Sec: 21934, Lr: 0.000103\n", "2020-02-09 14:04:48,774 Example #0\n", "2020-02-09 14:04:48,774 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:04:48,774 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:04:48,774 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im get the right hand wey you get . ’\n", "2020-02-09 14:04:48,774 Example #1\n", "2020-02-09 14:04:48,775 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:04:48,775 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:04:48,775 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay . We come comot from where we dey stay . We dey there to go preach for where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 14:04:48,775 Example #2\n", "2020-02-09 14:04:48,775 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:04:48,775 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:04:48,775 \tHypothesis: We no dey show love play . We dey enjoy to love our family ( 1 ) love ourself when we dey show dem love .\n", "2020-02-09 14:04:48,775 Example #3\n", "2020-02-09 14:04:48,775 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:04:48,775 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:04:48,775 \tHypothesis: As Jehovah love us , na im make us and e know wetin we really need and e dey make us happy .\n", "2020-02-09 14:04:48,775 Validation result (greedy) at epoch 139, step 30000: bleu: 13.82, loss: 56852.4688, ppl: 11.9728, duration: 12.1930s\n", "2020-02-09 14:04:53,432 Epoch 139: total training loss 175.61\n", "2020-02-09 14:04:53,432 EPOCH 140\n", "2020-02-09 14:04:58,750 Epoch 140 Step: 30100 Batch Loss: 0.935695 Tokens per Sec: 21598, Lr: 0.000103\n", "2020-02-09 14:05:08,709 Epoch 140 Step: 30200 Batch Loss: 0.835252 Tokens per Sec: 21088, Lr: 0.000103\n", "2020-02-09 14:05:14,923 Epoch 140: total training loss 174.31\n", "2020-02-09 14:05:14,924 EPOCH 141\n", "2020-02-09 14:05:18,714 Epoch 141 Step: 30300 Batch Loss: 0.816787 Tokens per Sec: 21864, Lr: 0.000103\n", "2020-02-09 14:05:28,605 Epoch 141 Step: 30400 Batch Loss: 0.801985 Tokens per Sec: 21024, Lr: 0.000103\n", "2020-02-09 14:05:36,393 Epoch 141: total training loss 175.19\n", "2020-02-09 14:05:36,393 EPOCH 142\n", "2020-02-09 14:05:38,499 Epoch 142 Step: 30500 Batch Loss: 0.947238 Tokens per Sec: 20245, Lr: 0.000103\n", "2020-02-09 14:05:48,413 Epoch 142 Step: 30600 Batch Loss: 0.794962 Tokens per Sec: 21181, Lr: 0.000103\n", "2020-02-09 14:05:57,739 Epoch 142: total training loss 173.71\n", "2020-02-09 14:05:57,739 EPOCH 143\n", "2020-02-09 14:05:58,381 Epoch 143 Step: 30700 Batch Loss: 0.692229 Tokens per Sec: 21088, Lr: 0.000103\n", "2020-02-09 14:06:08,357 Epoch 143 Step: 30800 Batch Loss: 0.856696 Tokens per Sec: 21133, Lr: 0.000103\n", "2020-02-09 14:06:18,352 Epoch 143 Step: 30900 Batch Loss: 1.012233 Tokens per Sec: 21072, Lr: 0.000103\n", "2020-02-09 14:06:19,347 Epoch 143: total training loss 172.81\n", "2020-02-09 14:06:19,347 EPOCH 144\n", "2020-02-09 14:06:28,345 Epoch 144 Step: 31000 Batch Loss: 1.026194 Tokens per Sec: 21379, Lr: 0.000103\n", "2020-02-09 14:06:40,857 Example #0\n", "2020-02-09 14:06:40,858 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:06:40,858 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:06:40,858 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be your hand . ’\n", "2020-02-09 14:06:40,858 Example #1\n", "2020-02-09 14:06:40,858 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:06:40,858 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:06:40,858 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay , and we dey stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:06:40,859 Example #2\n", "2020-02-09 14:06:40,859 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:06:40,859 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:06:40,859 \tHypothesis: We no dey show our family love , and we dey enjoy each other . We dey show this kind love pass when we dey show our pikin .\n", "2020-02-09 14:06:40,859 Example #3\n", "2020-02-09 14:06:40,859 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:06:40,859 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:06:40,859 \tHypothesis: As Jehovah love us , we know wetin e really need and wetin we no fit do to really happy .\n", "2020-02-09 14:06:40,859 Validation result (greedy) at epoch 144, step 31000: bleu: 13.60, loss: 57068.2891, ppl: 12.0862, duration: 12.5137s\n", "2020-02-09 14:06:50,861 Epoch 144 Step: 31100 Batch Loss: 0.436261 Tokens per Sec: 21107, Lr: 0.000103\n", "2020-02-09 14:06:53,447 Epoch 144: total training loss 172.98\n", "2020-02-09 14:06:53,447 EPOCH 145\n", "2020-02-09 14:07:00,862 Epoch 145 Step: 31200 Batch Loss: 0.888159 Tokens per Sec: 20952, Lr: 0.000103\n", "2020-02-09 14:07:10,816 Epoch 145 Step: 31300 Batch Loss: 0.976383 Tokens per Sec: 21307, Lr: 0.000103\n", "2020-02-09 14:07:14,905 Epoch 145: total training loss 171.71\n", "2020-02-09 14:07:14,906 EPOCH 146\n", "2020-02-09 14:07:20,808 Epoch 146 Step: 31400 Batch Loss: 0.948343 Tokens per Sec: 21440, Lr: 0.000103\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:07:30,746 Epoch 146 Step: 31500 Batch Loss: 0.899355 Tokens per Sec: 21194, Lr: 0.000103\n", "2020-02-09 14:07:36,292 Epoch 146: total training loss 171.23\n", "2020-02-09 14:07:36,292 EPOCH 147\n", "2020-02-09 14:07:40,723 Epoch 147 Step: 31600 Batch Loss: 0.714567 Tokens per Sec: 20765, Lr: 0.000103\n", "2020-02-09 14:07:50,675 Epoch 147 Step: 31700 Batch Loss: 0.963298 Tokens per Sec: 21338, Lr: 0.000103\n", "2020-02-09 14:07:57,909 Epoch 147: total training loss 171.62\n", "2020-02-09 14:07:57,909 EPOCH 148\n", "2020-02-09 14:08:00,636 Epoch 148 Step: 31800 Batch Loss: 0.639810 Tokens per Sec: 21075, Lr: 0.000103\n", "2020-02-09 14:08:10,523 Epoch 148 Step: 31900 Batch Loss: 0.846164 Tokens per Sec: 21358, Lr: 0.000103\n", "2020-02-09 14:08:19,535 Epoch 148: total training loss 171.46\n", "2020-02-09 14:08:19,535 EPOCH 149\n", "2020-02-09 14:08:20,573 Epoch 149 Step: 32000 Batch Loss: 0.518559 Tokens per Sec: 20187, Lr: 0.000103\n", "2020-02-09 14:08:32,965 Example #0\n", "2020-02-09 14:08:32,965 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:08:32,965 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:08:32,965 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be that . ’\n", "2020-02-09 14:08:32,965 Example #1\n", "2020-02-09 14:08:32,966 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:08:32,966 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:08:32,966 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay there for evening and we stay there . We come dey stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:08:32,966 Example #2\n", "2020-02-09 14:08:32,966 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:08:32,966 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:08:32,966 \tHypothesis: We no dey show our family love , and we dey enjoy each other . We dey show love when we dey show dem love .\n", "2020-02-09 14:08:32,966 Example #3\n", "2020-02-09 14:08:32,966 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:08:32,966 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:08:32,966 \tHypothesis: As Jehovah love us , na wetin make us fit really dey happy and e dey make us happy .\n", "2020-02-09 14:08:32,966 Validation result (greedy) at epoch 149, step 32000: bleu: 13.84, loss: 57246.6211, ppl: 12.1807, duration: 12.3928s\n", "2020-02-09 14:08:42,890 Epoch 149 Step: 32100 Batch Loss: 0.598931 Tokens per Sec: 21004, Lr: 0.000103\n", "2020-02-09 14:08:52,790 Epoch 149 Step: 32200 Batch Loss: 0.738913 Tokens per Sec: 21857, Lr: 0.000103\n", "2020-02-09 14:08:53,473 Epoch 149: total training loss 171.33\n", "2020-02-09 14:08:53,473 EPOCH 150\n", "2020-02-09 14:09:02,722 Epoch 150 Step: 32300 Batch Loss: 0.910482 Tokens per Sec: 21228, Lr: 0.000103\n", "2020-02-09 14:09:12,636 Epoch 150 Step: 32400 Batch Loss: 0.977322 Tokens per Sec: 21299, Lr: 0.000103\n", "2020-02-09 14:09:15,106 Epoch 150: total training loss 171.60\n", "2020-02-09 14:09:15,106 EPOCH 151\n", "2020-02-09 14:09:22,631 Epoch 151 Step: 32500 Batch Loss: 0.694298 Tokens per Sec: 21441, Lr: 0.000103\n", "2020-02-09 14:09:32,506 Epoch 151 Step: 32600 Batch Loss: 0.765342 Tokens per Sec: 21346, Lr: 0.000103\n", "2020-02-09 14:09:36,469 Epoch 151: total training loss 169.09\n", "2020-02-09 14:09:36,469 EPOCH 152\n", "2020-02-09 14:09:42,434 Epoch 152 Step: 32700 Batch Loss: 0.506705 Tokens per Sec: 21083, Lr: 0.000103\n", "2020-02-09 14:09:52,334 Epoch 152 Step: 32800 Batch Loss: 0.951443 Tokens per Sec: 21251, Lr: 0.000103\n", "2020-02-09 14:09:57,979 Epoch 152: total training loss 169.60\n", "2020-02-09 14:09:57,980 EPOCH 153\n", "2020-02-09 14:10:02,257 Epoch 153 Step: 32900 Batch Loss: 0.847457 Tokens per Sec: 21099, Lr: 0.000103\n", "2020-02-09 14:10:12,158 Epoch 153 Step: 33000 Batch Loss: 0.758335 Tokens per Sec: 21422, Lr: 0.000103\n", "2020-02-09 14:10:23,882 Example #0\n", "2020-02-09 14:10:23,882 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:10:23,882 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:10:23,882 \tHypothesis: Jehovah no let Satan suffer Job . But e no leave Job . Jehovah tell Job sey : ‘ Anybody wey dey do wetin e want , but e never do am . ’\n", "2020-02-09 14:10:23,882 Example #1\n", "2020-02-09 14:10:23,883 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:10:23,883 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:10:23,883 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for where we dey stay . We come dey stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:10:23,883 Example #2\n", "2020-02-09 14:10:23,883 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:10:23,883 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:10:23,883 \tHypothesis: We no dey show love play . We dey enjoy to care for our family ( Matt .\n", "2020-02-09 14:10:23,883 Example #3\n", "2020-02-09 14:10:23,883 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:10:23,883 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:10:23,884 \tHypothesis: As Jehovah love us and na im make us , e know wetin we need and e dey really make us happy .\n", "2020-02-09 14:10:23,884 Validation result (greedy) at epoch 153, step 33000: bleu: 14.17, loss: 57541.0703, ppl: 12.3383, duration: 11.7248s\n", "2020-02-09 14:10:31,133 Epoch 153: total training loss 168.93\n", "2020-02-09 14:10:31,133 EPOCH 154\n", "2020-02-09 14:10:33,724 Epoch 154 Step: 33100 Batch Loss: 0.773851 Tokens per Sec: 20491, Lr: 0.000103\n", "2020-02-09 14:10:43,585 Epoch 154 Step: 33200 Batch Loss: 0.517397 Tokens per Sec: 21513, Lr: 0.000103\n", "2020-02-09 14:10:52,525 Epoch 154: total training loss 168.41\n", "2020-02-09 14:10:52,525 EPOCH 155\n", "2020-02-09 14:10:53,452 Epoch 155 Step: 33300 Batch Loss: 0.733195 Tokens per Sec: 20157, Lr: 0.000103\n", "2020-02-09 14:11:03,385 Epoch 155 Step: 33400 Batch Loss: 0.783131 Tokens per Sec: 21785, Lr: 0.000103\n", "2020-02-09 14:11:13,375 Epoch 155 Step: 33500 Batch Loss: 0.904125 Tokens per Sec: 20493, Lr: 0.000103\n", "2020-02-09 14:11:14,164 Epoch 155: total training loss 168.12\n", "2020-02-09 14:11:14,164 EPOCH 156\n", "2020-02-09 14:11:23,283 Epoch 156 Step: 33600 Batch Loss: 0.866785 Tokens per Sec: 21310, Lr: 0.000103\n", "2020-02-09 14:11:33,174 Epoch 156 Step: 33700 Batch Loss: 0.948709 Tokens per Sec: 21243, Lr: 0.000103\n", "2020-02-09 14:11:35,752 Epoch 156: total training loss 169.08\n", "2020-02-09 14:11:35,753 EPOCH 157\n", "2020-02-09 14:11:43,115 Epoch 157 Step: 33800 Batch Loss: 0.907475 Tokens per Sec: 21675, Lr: 0.000103\n", "2020-02-09 14:11:53,026 Epoch 157 Step: 33900 Batch Loss: 0.821803 Tokens per Sec: 21435, Lr: 0.000103\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:11:57,207 Epoch 157: total training loss 166.36\n", "2020-02-09 14:11:57,207 EPOCH 158\n", "2020-02-09 14:12:03,045 Epoch 158 Step: 34000 Batch Loss: 0.839570 Tokens per Sec: 21424, Lr: 0.000103\n", "2020-02-09 14:12:15,485 Example #0\n", "2020-02-09 14:12:15,485 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:12:15,485 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:12:15,485 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be your hand . ’\n", "2020-02-09 14:12:15,486 Example #1\n", "2020-02-09 14:12:15,486 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:12:15,486 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:12:15,486 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay there for where we dey stay . We come still comot go where dem dey call rakilometers ( 15 km ) .\n", "2020-02-09 14:12:15,486 Example #2\n", "2020-02-09 14:12:15,486 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:12:15,486 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:12:15,486 \tHypothesis: We no dey show love play . We dey enjoy our family and our family ( Matt .\n", "2020-02-09 14:12:15,486 Example #3\n", "2020-02-09 14:12:15,486 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:12:15,487 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:12:15,487 \tHypothesis: As Jehovah love us and na im make us , e know wetin we need and e dey really make us happy .\n", "2020-02-09 14:12:15,487 Validation result (greedy) at epoch 158, step 34000: bleu: 14.23, loss: 57767.6172, ppl: 12.4610, duration: 12.4414s\n", "2020-02-09 14:12:25,433 Epoch 158 Step: 34100 Batch Loss: 0.593047 Tokens per Sec: 21064, Lr: 0.000072\n", "2020-02-09 14:12:31,254 Epoch 158: total training loss 166.39\n", "2020-02-09 14:12:31,255 EPOCH 159\n", "2020-02-09 14:12:35,356 Epoch 159 Step: 34200 Batch Loss: 0.566131 Tokens per Sec: 21527, Lr: 0.000072\n", "2020-02-09 14:12:45,306 Epoch 159 Step: 34300 Batch Loss: 0.323438 Tokens per Sec: 21107, Lr: 0.000072\n", "2020-02-09 14:12:52,781 Epoch 159: total training loss 163.98\n", "2020-02-09 14:12:52,782 EPOCH 160\n", "2020-02-09 14:12:55,289 Epoch 160 Step: 34400 Batch Loss: 0.808073 Tokens per Sec: 20218, Lr: 0.000072\n", "2020-02-09 14:13:05,256 Epoch 160 Step: 34500 Batch Loss: 0.786299 Tokens per Sec: 21327, Lr: 0.000072\n", "2020-02-09 14:13:14,372 Epoch 160: total training loss 163.83\n", "2020-02-09 14:13:14,373 EPOCH 161\n", "2020-02-09 14:13:15,217 Epoch 161 Step: 34600 Batch Loss: 0.875038 Tokens per Sec: 20171, Lr: 0.000072\n", "2020-02-09 14:13:25,164 Epoch 161 Step: 34700 Batch Loss: 0.643738 Tokens per Sec: 21418, Lr: 0.000072\n", "2020-02-09 14:13:35,033 Epoch 161 Step: 34800 Batch Loss: 0.720756 Tokens per Sec: 21346, Lr: 0.000072\n", "2020-02-09 14:13:35,825 Epoch 161: total training loss 163.18\n", "2020-02-09 14:13:35,825 EPOCH 162\n", "2020-02-09 14:13:44,970 Epoch 162 Step: 34900 Batch Loss: 0.716529 Tokens per Sec: 21560, Lr: 0.000072\n", "2020-02-09 14:13:54,844 Epoch 162 Step: 35000 Batch Loss: 0.638049 Tokens per Sec: 21007, Lr: 0.000072\n", "2020-02-09 14:14:06,894 Example #0\n", "2020-02-09 14:14:06,895 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:14:06,895 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:14:06,895 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey your hand don weak , but e never stop to get your hand . ’\n", "2020-02-09 14:14:06,895 Example #1\n", "2020-02-09 14:14:06,895 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:14:06,895 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:14:06,895 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for evening and we dey comot . We come dey stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:14:06,895 Example #2\n", "2020-02-09 14:14:06,895 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:14:06,895 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:14:06,895 \tHypothesis: We no dey show love play . We dey enjoy our family and our family ( 4 ) when we dey show love dem , e dey make dem get better character .\n", "2020-02-09 14:14:06,896 Example #3\n", "2020-02-09 14:14:06,896 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:14:06,896 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:14:06,896 \tHypothesis: As Jehovah love us and na im make us , e know wetin we need and we no go happy .\n", "2020-02-09 14:14:06,896 Validation result (greedy) at epoch 162, step 35000: bleu: 13.97, loss: 57825.0547, ppl: 12.4923, duration: 12.0512s\n", "2020-02-09 14:14:09,366 Epoch 162: total training loss 162.78\n", "2020-02-09 14:14:09,366 EPOCH 163\n", "2020-02-09 14:14:16,926 Epoch 163 Step: 35100 Batch Loss: 0.860602 Tokens per Sec: 21386, Lr: 0.000072\n", "2020-02-09 14:14:26,798 Epoch 163 Step: 35200 Batch Loss: 0.720456 Tokens per Sec: 21278, Lr: 0.000072\n", "2020-02-09 14:14:30,871 Epoch 163: total training loss 161.83\n", "2020-02-09 14:14:30,871 EPOCH 164\n", "2020-02-09 14:14:36,757 Epoch 164 Step: 35300 Batch Loss: 0.857684 Tokens per Sec: 21221, Lr: 0.000072\n", "2020-02-09 14:14:46,680 Epoch 164 Step: 35400 Batch Loss: 0.476587 Tokens per Sec: 20918, Lr: 0.000072\n", "2020-02-09 14:14:52,433 Epoch 164: total training loss 162.14\n", "2020-02-09 14:14:52,434 EPOCH 165\n", "2020-02-09 14:14:56,642 Epoch 165 Step: 35500 Batch Loss: 0.580762 Tokens per Sec: 21280, Lr: 0.000072\n", "2020-02-09 14:15:06,630 Epoch 165 Step: 35600 Batch Loss: 0.692200 Tokens per Sec: 21404, Lr: 0.000072\n", "2020-02-09 14:15:13,879 Epoch 165: total training loss 160.86\n", "2020-02-09 14:15:13,879 EPOCH 166\n", "2020-02-09 14:15:16,590 Epoch 166 Step: 35700 Batch Loss: 0.814139 Tokens per Sec: 21290, Lr: 0.000072\n", "2020-02-09 14:15:26,529 Epoch 166 Step: 35800 Batch Loss: 0.440191 Tokens per Sec: 21176, Lr: 0.000072\n", "2020-02-09 14:15:35,296 Epoch 166: total training loss 160.49\n", "2020-02-09 14:15:35,296 EPOCH 167\n", "2020-02-09 14:15:36,536 Epoch 167 Step: 35900 Batch Loss: 0.932627 Tokens per Sec: 20986, Lr: 0.000072\n", "2020-02-09 14:15:46,421 Epoch 167 Step: 36000 Batch Loss: 0.772605 Tokens per Sec: 21192, Lr: 0.000072\n", "2020-02-09 14:15:58,696 Example #0\n", "2020-02-09 14:15:58,696 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:15:58,696 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:15:58,696 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey e get , na im be the Person wey dey use im hand do wetin e want . ’\n", "2020-02-09 14:15:58,697 Example #1\n", "2020-02-09 14:15:58,697 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:15:58,697 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:15:58,697 \tHypothesis: Corinna , wey come from prison , talk sey : “ We comot for where we dey stay . We come still comot go where we dey go .\n", "2020-02-09 14:15:58,697 Example #2\n", "2020-02-09 14:15:58,697 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:15:58,697 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:15:58,697 \tHypothesis: We no dey show love play . We dey enjoy to care for our family ( Matt .\n", "2020-02-09 14:15:58,697 Example #3\n", "2020-02-09 14:15:58,697 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:15:58,698 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:15:58,698 \tHypothesis: As Jehovah love us and na im make us , we know wetin we need and we no go really happy .\n", "2020-02-09 14:15:58,698 Validation result (greedy) at epoch 167, step 36000: bleu: 13.99, loss: 57961.2656, ppl: 12.5668, duration: 12.2759s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:16:08,591 Epoch 167 Step: 36100 Batch Loss: 0.651248 Tokens per Sec: 21533, Lr: 0.000072\n", "2020-02-09 14:16:08,987 Epoch 167: total training loss 161.08\n", "2020-02-09 14:16:08,988 EPOCH 168\n", "2020-02-09 14:16:18,537 Epoch 168 Step: 36200 Batch Loss: 0.704232 Tokens per Sec: 21435, Lr: 0.000072\n", "2020-02-09 14:16:28,401 Epoch 168 Step: 36300 Batch Loss: 0.684729 Tokens per Sec: 21561, Lr: 0.000072\n", "2020-02-09 14:16:30,265 Epoch 168: total training loss 159.60\n", "2020-02-09 14:16:30,265 EPOCH 169\n", "2020-02-09 14:16:38,278 Epoch 169 Step: 36400 Batch Loss: 0.832892 Tokens per Sec: 21379, Lr: 0.000072\n", "2020-02-09 14:16:48,172 Epoch 169 Step: 36500 Batch Loss: 0.603741 Tokens per Sec: 21275, Lr: 0.000072\n", "2020-02-09 14:16:51,618 Epoch 169: total training loss 160.18\n", "2020-02-09 14:16:51,618 EPOCH 170\n", "2020-02-09 14:16:58,056 Epoch 170 Step: 36600 Batch Loss: 0.663849 Tokens per Sec: 21777, Lr: 0.000072\n", "2020-02-09 14:17:07,895 Epoch 170 Step: 36700 Batch Loss: 0.863176 Tokens per Sec: 21184, Lr: 0.000072\n", "2020-02-09 14:17:13,018 Epoch 170: total training loss 159.95\n", "2020-02-09 14:17:13,018 EPOCH 171\n", "2020-02-09 14:17:17,904 Epoch 171 Step: 36800 Batch Loss: 0.693530 Tokens per Sec: 21226, Lr: 0.000072\n", "2020-02-09 14:17:27,829 Epoch 171 Step: 36900 Batch Loss: 0.840549 Tokens per Sec: 21402, Lr: 0.000072\n", "2020-02-09 14:17:34,547 Epoch 171: total training loss 159.53\n", "2020-02-09 14:17:34,548 EPOCH 172\n", "2020-02-09 14:17:37,752 Epoch 172 Step: 37000 Batch Loss: 0.823699 Tokens per Sec: 22034, Lr: 0.000072\n", "2020-02-09 14:17:50,241 Example #0\n", "2020-02-09 14:17:50,241 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:17:50,241 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:17:50,242 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey your hand don turn to your hand . ’\n", "2020-02-09 14:17:50,242 Example #1\n", "2020-02-09 14:17:50,242 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:17:50,242 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:17:50,242 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for where we dey stay . We come dey stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:17:50,242 Example #2\n", "2020-02-09 14:17:50,242 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:17:50,242 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:17:50,242 \tHypothesis: We no dey show love play . We dey enjoy to care for our family ( Matt .\n", "2020-02-09 14:17:50,242 Example #3\n", "2020-02-09 14:17:50,243 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:17:50,243 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:17:50,243 \tHypothesis: As Jehovah love us , we know wetin e really need and wetin we no fit do to make am happy .\n", "2020-02-09 14:17:50,243 Validation result (greedy) at epoch 172, step 37000: bleu: 13.90, loss: 58058.8984, ppl: 12.6205, duration: 12.4897s\n", "2020-02-09 14:18:00,155 Epoch 172 Step: 37100 Batch Loss: 0.769521 Tokens per Sec: 21285, Lr: 0.000072\n", "2020-02-09 14:18:08,438 Epoch 172: total training loss 158.29\n", "2020-02-09 14:18:08,438 EPOCH 173\n", "2020-02-09 14:18:10,182 Epoch 173 Step: 37200 Batch Loss: 0.821173 Tokens per Sec: 22753, Lr: 0.000072\n", "2020-02-09 14:18:20,110 Epoch 173 Step: 37300 Batch Loss: 0.305585 Tokens per Sec: 21509, Lr: 0.000072\n", "2020-02-09 14:18:29,890 Epoch 173: total training loss 157.98\n", "2020-02-09 14:18:29,890 EPOCH 174\n", "2020-02-09 14:18:30,136 Epoch 174 Step: 37400 Batch Loss: 0.743520 Tokens per Sec: 18172, Lr: 0.000072\n", "2020-02-09 14:18:40,052 Epoch 174 Step: 37500 Batch Loss: 0.572477 Tokens per Sec: 21162, Lr: 0.000072\n", "2020-02-09 14:18:49,984 Epoch 174 Step: 37600 Batch Loss: 0.582629 Tokens per Sec: 21819, Lr: 0.000072\n", "2020-02-09 14:18:51,292 Epoch 174: total training loss 158.26\n", "2020-02-09 14:18:51,292 EPOCH 175\n", "2020-02-09 14:19:00,037 Epoch 175 Step: 37700 Batch Loss: 0.719113 Tokens per Sec: 21360, Lr: 0.000072\n", "2020-02-09 14:19:10,054 Epoch 175 Step: 37800 Batch Loss: 0.837759 Tokens per Sec: 21285, Lr: 0.000072\n", "2020-02-09 14:19:12,815 Epoch 175: total training loss 157.71\n", "2020-02-09 14:19:12,815 EPOCH 176\n", "2020-02-09 14:19:20,090 Epoch 176 Step: 37900 Batch Loss: 0.820353 Tokens per Sec: 20788, Lr: 0.000072\n", "2020-02-09 14:19:30,138 Epoch 176 Step: 38000 Batch Loss: 0.584996 Tokens per Sec: 21018, Lr: 0.000072\n", "2020-02-09 14:19:42,516 Example #0\n", "2020-02-09 14:19:42,517 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:19:42,517 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:19:42,517 \tHypothesis: Jehovah no let Satan suffer Job . But e no leave wetin e want . Jehovah tell Job sey : ‘ Anybody wey get power , but e never do am . ’\n", "2020-02-09 14:19:42,517 Example #1\n", "2020-02-09 14:19:42,517 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:19:42,517 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:19:42,517 \tHypothesis: Corinna , wey come from prison , talk sey : “ We stay for where we dey stay . We come still dey stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:19:42,517 Example #2\n", "2020-02-09 14:19:42,517 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:19:42,517 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:19:42,517 \tHypothesis: We no dey show love play . We dey enjoy to care for our family ( Matt .\n", "2020-02-09 14:19:42,518 Example #3\n", "2020-02-09 14:19:42,518 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:19:42,518 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:19:42,518 \tHypothesis: As Jehovah love us , we know wetin e really need and wetin we no fit do to make am happy .\n", "2020-02-09 14:19:42,518 Validation result (greedy) at epoch 176, step 38000: bleu: 13.88, loss: 58165.6484, ppl: 12.6795, duration: 12.3789s\n", "2020-02-09 14:19:46,882 Epoch 176: total training loss 157.69\n", "2020-02-09 14:19:46,882 EPOCH 177\n", "2020-02-09 14:19:52,491 Epoch 177 Step: 38100 Batch Loss: 0.839295 Tokens per Sec: 21240, Lr: 0.000072\n", "2020-02-09 14:20:02,412 Epoch 177 Step: 38200 Batch Loss: 0.867415 Tokens per Sec: 21135, Lr: 0.000072\n", "2020-02-09 14:20:08,444 Epoch 177: total training loss 157.32\n", "2020-02-09 14:20:08,445 EPOCH 178\n", "2020-02-09 14:20:12,471 Epoch 178 Step: 38300 Batch Loss: 0.684579 Tokens per Sec: 20472, Lr: 0.000072\n", "2020-02-09 14:20:22,483 Epoch 178 Step: 38400 Batch Loss: 0.888620 Tokens per Sec: 21269, Lr: 0.000072\n", "2020-02-09 14:20:30,008 Epoch 178: total training loss 157.04\n", "2020-02-09 14:20:30,008 EPOCH 179\n", "2020-02-09 14:20:32,406 Epoch 179 Step: 38500 Batch Loss: 0.747984 Tokens per Sec: 20546, Lr: 0.000072\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:20:42,318 Epoch 179 Step: 38600 Batch Loss: 0.880630 Tokens per Sec: 21455, Lr: 0.000072\n", "2020-02-09 14:20:51,587 Epoch 179: total training loss 158.21\n", "2020-02-09 14:20:51,587 EPOCH 180\n", "2020-02-09 14:20:52,223 Epoch 180 Step: 38700 Batch Loss: 0.715024 Tokens per Sec: 19638, Lr: 0.000072\n", "2020-02-09 14:21:02,175 Epoch 180 Step: 38800 Batch Loss: 0.700444 Tokens per Sec: 21450, Lr: 0.000072\n", "2020-02-09 14:21:12,087 Epoch 180 Step: 38900 Batch Loss: 0.672916 Tokens per Sec: 21168, Lr: 0.000072\n", "2020-02-09 14:21:13,071 Epoch 180: total training loss 155.96\n", "2020-02-09 14:21:13,072 EPOCH 181\n", "2020-02-09 14:21:22,093 Epoch 181 Step: 39000 Batch Loss: 0.338834 Tokens per Sec: 21004, Lr: 0.000072\n", "2020-02-09 14:21:34,349 Example #0\n", "2020-02-09 14:21:34,349 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:21:34,349 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:21:34,349 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be your hand . ’\n", "2020-02-09 14:21:34,349 Example #1\n", "2020-02-09 14:21:34,350 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:21:34,350 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:21:34,350 \tHypothesis: Corinna , wey come from prison , talk sey : ‘ We comot for where we dey stay . We come comot go where we dey go . We dey stay for there for 15 kilometers ( 15 km ) .\n", "2020-02-09 14:21:34,350 Example #2\n", "2020-02-09 14:21:34,350 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:21:34,350 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:21:34,350 \tHypothesis: We no dey like to play . We dey enjoy to show love , and our family ( 4 ) when we dey show dem love .\n", "2020-02-09 14:21:34,350 Example #3\n", "2020-02-09 14:21:34,350 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:21:34,350 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:21:34,351 \tHypothesis: As Jehovah love us , na im make us and wetin we need to really dey happy .\n", "2020-02-09 14:21:34,351 Validation result (greedy) at epoch 181, step 39000: bleu: 14.05, loss: 58317.5391, ppl: 12.7639, duration: 12.2573s\n", "2020-02-09 14:21:44,280 Epoch 181 Step: 39100 Batch Loss: 0.770275 Tokens per Sec: 21262, Lr: 0.000072\n", "2020-02-09 14:21:46,863 Epoch 181: total training loss 155.99\n", "2020-02-09 14:21:46,864 EPOCH 182\n", "2020-02-09 14:21:54,235 Epoch 182 Step: 39200 Batch Loss: 0.227631 Tokens per Sec: 21109, Lr: 0.000072\n", "2020-02-09 14:22:04,147 Epoch 182 Step: 39300 Batch Loss: 0.759427 Tokens per Sec: 21534, Lr: 0.000072\n", "2020-02-09 14:22:08,424 Epoch 182: total training loss 156.69\n", "2020-02-09 14:22:08,424 EPOCH 183\n", "2020-02-09 14:22:14,115 Epoch 183 Step: 39400 Batch Loss: 0.530501 Tokens per Sec: 21576, Lr: 0.000072\n", "2020-02-09 14:22:24,037 Epoch 183 Step: 39500 Batch Loss: 0.611335 Tokens per Sec: 20774, Lr: 0.000072\n", "2020-02-09 14:22:30,008 Epoch 183: total training loss 156.20\n", "2020-02-09 14:22:30,008 EPOCH 184\n", "2020-02-09 14:22:34,037 Epoch 184 Step: 39600 Batch Loss: 0.632424 Tokens per Sec: 20706, Lr: 0.000072\n", "2020-02-09 14:22:44,046 Epoch 184 Step: 39700 Batch Loss: 0.458936 Tokens per Sec: 21489, Lr: 0.000072\n", "2020-02-09 14:22:51,656 Epoch 184: total training loss 156.09\n", "2020-02-09 14:22:51,656 EPOCH 185\n", "2020-02-09 14:22:54,090 Epoch 185 Step: 39800 Batch Loss: 0.359203 Tokens per Sec: 20522, Lr: 0.000072\n", "2020-02-09 14:23:04,002 Epoch 185 Step: 39900 Batch Loss: 0.567300 Tokens per Sec: 21396, Lr: 0.000072\n", "2020-02-09 14:23:13,377 Epoch 185: total training loss 156.37\n", "2020-02-09 14:23:13,378 EPOCH 186\n", "2020-02-09 14:23:14,013 Epoch 186 Step: 40000 Batch Loss: 0.797387 Tokens per Sec: 19579, Lr: 0.000072\n", "2020-02-09 14:23:26,217 Example #0\n", "2020-02-09 14:23:26,217 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:23:26,217 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:23:26,217 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be your hand . ’\n", "2020-02-09 14:23:26,217 Example #1\n", "2020-02-09 14:23:26,217 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:23:26,218 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:23:26,218 \tHypothesis: Corinna , wey come from prison , talk sey : ‘ We comot for where we dey stay . We stay there for 25 kilometers ( 15 km ) go there .\n", "2020-02-09 14:23:26,218 Example #2\n", "2020-02-09 14:23:26,218 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:23:26,218 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:23:26,218 \tHypothesis: We no dey show love play . We dey enjoy to love dem ( other people ) when dem dey show love their family and other things .\n", "2020-02-09 14:23:26,218 Example #3\n", "2020-02-09 14:23:26,218 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:23:26,218 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:23:26,218 \tHypothesis: As Jehovah love us and na im make us , e know wetin we need and wetin we really need .\n", "2020-02-09 14:23:26,218 Validation result (greedy) at epoch 186, step 40000: bleu: 14.23, loss: 58420.3828, ppl: 12.8213, duration: 12.2052s\n", "2020-02-09 14:23:36,067 Epoch 186 Step: 40100 Batch Loss: 0.371000 Tokens per Sec: 21901, Lr: 0.000050\n", "2020-02-09 14:23:45,919 Epoch 186 Step: 40200 Batch Loss: 0.779848 Tokens per Sec: 21101, Lr: 0.000050\n", "2020-02-09 14:23:46,902 Epoch 186: total training loss 153.79\n", "2020-02-09 14:23:46,902 EPOCH 187\n", "2020-02-09 14:23:55,806 Epoch 187 Step: 40300 Batch Loss: 0.666796 Tokens per Sec: 21358, Lr: 0.000050\n", "2020-02-09 14:24:05,679 Epoch 187 Step: 40400 Batch Loss: 0.693041 Tokens per Sec: 21568, Lr: 0.000050\n", "2020-02-09 14:24:08,231 Epoch 187: total training loss 153.45\n", "2020-02-09 14:24:08,231 EPOCH 188\n", "2020-02-09 14:24:15,587 Epoch 188 Step: 40500 Batch Loss: 0.745225 Tokens per Sec: 21125, Lr: 0.000050\n", "2020-02-09 14:24:25,460 Epoch 188 Step: 40600 Batch Loss: 0.297083 Tokens per Sec: 21297, Lr: 0.000050\n", "2020-02-09 14:24:29,691 Epoch 188: total training loss 153.67\n", "2020-02-09 14:24:29,692 EPOCH 189\n", "2020-02-09 14:24:35,369 Epoch 189 Step: 40700 Batch Loss: 0.654812 Tokens per Sec: 21283, Lr: 0.000050\n", "2020-02-09 14:24:45,250 Epoch 189 Step: 40800 Batch Loss: 0.321035 Tokens per Sec: 21423, Lr: 0.000050\n", "2020-02-09 14:24:51,077 Epoch 189: total training loss 152.52\n", "2020-02-09 14:24:51,078 EPOCH 190\n", "2020-02-09 14:24:55,157 Epoch 190 Step: 40900 Batch Loss: 0.579763 Tokens per Sec: 20671, Lr: 0.000050\n", "2020-02-09 14:25:05,078 Epoch 190 Step: 41000 Batch Loss: 0.804572 Tokens per Sec: 21245, Lr: 0.000050\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:25:17,237 Example #0\n", "2020-02-09 14:25:17,238 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:25:17,238 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:25:17,238 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be the true God . ’\n", "2020-02-09 14:25:17,238 Example #1\n", "2020-02-09 14:25:17,238 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:25:17,238 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:25:17,238 \tHypothesis: Corinna , e come sey : ‘ We comot for where we dey stay . We dey stay there for where we dey go . We dey go preach for there .\n", "2020-02-09 14:25:17,238 Example #2\n", "2020-02-09 14:25:17,238 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:25:17,238 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:25:17,238 \tHypothesis: We no dey show love to enjoy our family ( adcas ) when we dey show love play our pikin , and we dey show our children love .\n", "2020-02-09 14:25:17,239 Example #3\n", "2020-02-09 14:25:17,239 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:25:17,239 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:25:17,239 \tHypothesis: As Jehovah love us , na im make us and wetin we need to really dey happy .\n", "2020-02-09 14:25:17,239 Validation result (greedy) at epoch 190, step 41000: bleu: 13.84, loss: 58521.4727, ppl: 12.8780, duration: 12.1607s\n", "2020-02-09 14:25:24,905 Epoch 190: total training loss 154.18\n", "2020-02-09 14:25:24,905 EPOCH 191\n", "2020-02-09 14:25:27,111 Epoch 191 Step: 41100 Batch Loss: 0.751742 Tokens per Sec: 20157, Lr: 0.000050\n", "2020-02-09 14:25:36,981 Epoch 191 Step: 41200 Batch Loss: 0.704487 Tokens per Sec: 21397, Lr: 0.000050\n", "2020-02-09 14:25:46,267 Epoch 191: total training loss 152.09\n", "2020-02-09 14:25:46,268 EPOCH 192\n", "2020-02-09 14:25:46,910 Epoch 192 Step: 41300 Batch Loss: 0.321632 Tokens per Sec: 18733, Lr: 0.000050\n", "2020-02-09 14:25:56,826 Epoch 192 Step: 41400 Batch Loss: 0.707680 Tokens per Sec: 21317, Lr: 0.000050\n", "2020-02-09 14:26:06,661 Epoch 192 Step: 41500 Batch Loss: 0.677484 Tokens per Sec: 21201, Lr: 0.000050\n", "2020-02-09 14:26:07,841 Epoch 192: total training loss 153.01\n", "2020-02-09 14:26:07,841 EPOCH 193\n", "2020-02-09 14:26:16,621 Epoch 193 Step: 41600 Batch Loss: 0.795605 Tokens per Sec: 21411, Lr: 0.000050\n", "2020-02-09 14:26:26,461 Epoch 193 Step: 41700 Batch Loss: 0.748809 Tokens per Sec: 21260, Lr: 0.000050\n", "2020-02-09 14:26:29,311 Epoch 193: total training loss 152.26\n", "2020-02-09 14:26:29,311 EPOCH 194\n", "2020-02-09 14:26:36,362 Epoch 194 Step: 41800 Batch Loss: 0.730199 Tokens per Sec: 20853, Lr: 0.000050\n", "2020-02-09 14:26:46,246 Epoch 194 Step: 41900 Batch Loss: 0.782122 Tokens per Sec: 21153, Lr: 0.000050\n", "2020-02-09 14:26:50,892 Epoch 194: total training loss 151.95\n", "2020-02-09 14:26:50,893 EPOCH 195\n", "2020-02-09 14:26:56,155 Epoch 195 Step: 42000 Batch Loss: 0.748613 Tokens per Sec: 21127, Lr: 0.000050\n", "2020-02-09 14:27:08,746 Example #0\n", "2020-02-09 14:27:08,746 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:27:08,746 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:27:08,746 \tHypothesis: Jehovah no let Satan do . E no leave Job . But e do wetin Satan talk . E sey : ‘ Anybody wey get power , e don use your hand do wetin e want . ’\n", "2020-02-09 14:27:08,746 Example #1\n", "2020-02-09 14:27:08,747 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:27:08,747 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:27:08,747 \tHypothesis: Corinna , wey come from prison , talk sey : ‘ We stay for where we dey stay . We come dey stay there for 25 kilometers ( 15 km ) go where dem dey call us go .\n", "2020-02-09 14:27:08,747 Example #2\n", "2020-02-09 14:27:08,747 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:27:08,747 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:27:08,747 \tHypothesis: We no dey show love play . We dey enjoy to care for our family ( Eph . 4 : 5 ) We go still show dem love , and our pikin .\n", "2020-02-09 14:27:08,747 Example #3\n", "2020-02-09 14:27:08,748 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:27:08,748 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:27:08,748 \tHypothesis: As Jehovah love us and na im make us , e know wetin we need and e dey really make us happy .\n", "2020-02-09 14:27:08,748 Validation result (greedy) at epoch 195, step 42000: bleu: 13.85, loss: 58654.3438, ppl: 12.9530, duration: 12.5924s\n", "2020-02-09 14:27:18,631 Epoch 195 Step: 42100 Batch Loss: 0.613583 Tokens per Sec: 20873, Lr: 0.000050\n", "2020-02-09 14:27:25,049 Epoch 195: total training loss 151.98\n", "2020-02-09 14:27:25,049 EPOCH 196\n", "2020-02-09 14:27:28,526 Epoch 196 Step: 42200 Batch Loss: 0.743989 Tokens per Sec: 21417, Lr: 0.000050\n", "2020-02-09 14:27:38,351 Epoch 196 Step: 42300 Batch Loss: 0.844665 Tokens per Sec: 21317, Lr: 0.000050\n", "2020-02-09 14:27:46,325 Epoch 196: total training loss 151.28\n", "2020-02-09 14:27:46,325 EPOCH 197\n", "2020-02-09 14:27:48,219 Epoch 197 Step: 42400 Batch Loss: 0.708702 Tokens per Sec: 20393, Lr: 0.000050\n", "2020-02-09 14:27:58,071 Epoch 197 Step: 42500 Batch Loss: 0.415755 Tokens per Sec: 21924, Lr: 0.000050\n", "2020-02-09 14:28:07,718 Epoch 197: total training loss 151.09\n", "2020-02-09 14:28:07,718 EPOCH 198\n", "2020-02-09 14:28:07,963 Epoch 198 Step: 42600 Batch Loss: 0.825495 Tokens per Sec: 19112, Lr: 0.000050\n", "2020-02-09 14:28:17,795 Epoch 198 Step: 42700 Batch Loss: 0.843647 Tokens per Sec: 21215, Lr: 0.000050\n", "2020-02-09 14:28:27,679 Epoch 198 Step: 42800 Batch Loss: 0.833294 Tokens per Sec: 21857, Lr: 0.000050\n", "2020-02-09 14:28:29,056 Epoch 198: total training loss 150.54\n", "2020-02-09 14:28:29,057 EPOCH 199\n", "2020-02-09 14:28:37,595 Epoch 199 Step: 42900 Batch Loss: 0.795709 Tokens per Sec: 20923, Lr: 0.000050\n", "2020-02-09 14:28:47,518 Epoch 199 Step: 43000 Batch Loss: 0.664702 Tokens per Sec: 21448, Lr: 0.000050\n", "2020-02-09 14:28:59,836 Example #0\n", "2020-02-09 14:28:59,836 \tSource: Jehovah did not do that , but he allowed Satan to test Job , stating : “ Everything that he has is in your hand . ”\n", "2020-02-09 14:28:59,836 \tReference: E tell am sey : ‘ Everything wey e get dey your hand . ’\n", "2020-02-09 14:28:59,837 \tHypothesis: Jehovah no let Satan do wetin e no like . E tell Job sey : ‘ Anybody wey dey do wetin e want , na im be your hand . ’\n", "2020-02-09 14:28:59,837 Example #1\n", "2020-02-09 14:28:59,837 \tSource: Corinna said : “ We left our work area in the evening and walked to a railway station 25 kilometers ( 15 miles ) away .\n", "2020-02-09 14:28:59,837 \tReference: Corinna talk sey : “ We comot for where we dey work for evening come trek go where people dey enter train wey be 25 kilometer ( 15 miles ) from where the farm dey .\n", "2020-02-09 14:28:59,837 \tHypothesis: Corinna , wey come from prison , talk sey : ‘ We stay for where we dey stay . We come still dey comot for where dem dey call rakilometers ( 15 km ) go preach .\n", "2020-02-09 14:28:59,837 Example #2\n", "2020-02-09 14:28:59,837 \tSource: Romantic love ( eʹros ) brings delight , and love for family ( stor·geʹ ) is vital when children enter the picture .\n", "2020-02-09 14:28:59,837 \tReference: ( 1 ) Love wey friend get for each other ( Greek , phi·liʹa ) ; ( 2 ) love wey man and woman get for each other ( eʹros ) ; ( 3 ) love wey people for family get for each other ( stor·geʹ ) .\n", "2020-02-09 14:28:59,837 \tHypothesis: We no dey show love play . We dey enjoy to care for our family ( 4 ) when we dey show love their children .\n", "2020-02-09 14:28:59,837 Example #3\n", "2020-02-09 14:28:59,838 \tSource: As our loving Designer and Creator , Jehovah knows what we need in order to be truly happy , and he fills that need abundantly .\n", "2020-02-09 14:28:59,838 \tReference: Because na Jehovah make us , e know wetin go make us happy . E don even give us pass wetin we need .\n", "2020-02-09 14:28:59,838 \tHypothesis: As Jehovah love us and na this one make us sabi wetin we need , we dey really happy .\n", "2020-02-09 14:28:59,838 Validation result (greedy) at epoch 199, step 43000: bleu: 14.20, loss: 58695.1172, ppl: 12.9761, duration: 12.3193s\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:29:02,928 Epoch 199: total training loss 150.54\n", "2020-02-09 14:29:02,928 EPOCH 200\n", "2020-02-09 14:29:09,841 Epoch 200 Step: 43100 Batch Loss: 0.733609 Tokens per Sec: 21135, Lr: 0.000050\n", "2020-02-09 14:29:19,853 Epoch 200 Step: 43200 Batch Loss: 0.636086 Tokens per Sec: 21286, Lr: 0.000050\n", "2020-02-09 14:29:24,549 Epoch 200: total training loss 149.87\n", "2020-02-09 14:29:24,550 Training ended after 200 epochs.\n", "2020-02-09 14:29:24,550 Best validation result (greedy) at step 10000: 8.84 ppl.\n", "2020-02-09 14:29:39,550 dev bleu: 12.78 [Beam search decoding with beam size = 5 and alpha = 1.0]\n", "2020-02-09 14:29:39,551 Translations saved to: models/enpcm_transformer/00010000.hyps.dev\n", "2020-02-09 14:30:02,666 test bleu: 24.29 [Beam search decoding with beam size = 5 and alpha = 1.0]\n", "2020-02-09 14:30:02,667 Translations saved to: models/enpcm_transformer/00010000.hyps.test\n" ] } ], "source": [ "# Train the model\n", "# You can press Ctrl-C to stop. And then run the next cell to save your checkpoints! \n", "!cd joeynmt; python3 -m joeynmt train configs/transformer_$src$tgt.yaml" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [], "source": [ "!mkdir -p \"$experiment_path/models/${src}${tgt}_transformer/\"" ] }, { "cell_type": "code", "execution_count": 24, "metadata": { "colab": {}, "colab_type": "code", "id": "MBoDS09JM807" }, "outputs": [], "source": [ "# Copy the created models from the notebook storage to google drive for persistant storage \n", "!cp -r joeynmt/models/${src}${tgt}_transformer/* \"$experiment_path/models/${src}${tgt}_transformer/\"" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "colab": {}, "colab_type": "code", "id": "n94wlrCjVc17" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Steps: 1000\tLoss: 73394.23438\tPPL: 24.65521\tbleu: 2.06072\tLR: 0.00030000\t*\r\n", "Steps: 2000\tLoss: 62560.94141\tPPL: 15.36234\tbleu: 4.45320\tLR: 0.00030000\t*\r\n", "Steps: 3000\tLoss: 56698.69531\tPPL: 11.89271\tbleu: 7.73914\tLR: 0.00030000\t*\r\n", "Steps: 4000\tLoss: 53580.67188\tPPL: 10.37882\tbleu: 8.94406\tLR: 0.00030000\t*\r\n", "Steps: 5000\tLoss: 51855.69531\tPPL: 9.62574\tbleu: 10.11587\tLR: 0.00030000\t*\r\n", "Steps: 6000\tLoss: 50861.65234\tPPL: 9.21685\tbleu: 11.02907\tLR: 0.00030000\t*\r\n", "Steps: 7000\tLoss: 50434.14062\tPPL: 9.04638\tbleu: 11.47610\tLR: 0.00030000\t*\r\n", "Steps: 8000\tLoss: 50059.23047\tPPL: 8.89948\tbleu: 12.05842\tLR: 0.00030000\t*\r\n", "Steps: 9000\tLoss: 50120.64844\tPPL: 8.92338\tbleu: 12.70072\tLR: 0.00030000\t\r\n", "Steps: 10000\tLoss: 49895.84375\tPPL: 8.83621\tbleu: 12.57659\tLR: 0.00030000\t*\r\n", "Steps: 11000\tLoss: 50398.80078\tPPL: 9.03242\tbleu: 12.10177\tLR: 0.00030000\t\r\n", "Steps: 12000\tLoss: 50478.47656\tPPL: 9.06391\tbleu: 12.89334\tLR: 0.00030000\t\r\n", "Steps: 13000\tLoss: 51401.37500\tPPL: 9.43665\tbleu: 12.72072\tLR: 0.00030000\t\r\n", "Steps: 14000\tLoss: 51182.58594\tPPL: 9.34692\tbleu: 13.24285\tLR: 0.00030000\t\r\n", "Steps: 15000\tLoss: 51892.37500\tPPL: 9.64117\tbleu: 13.67947\tLR: 0.00030000\t\r\n", "Steps: 16000\tLoss: 52236.70312\tPPL: 9.78723\tbleu: 13.59716\tLR: 0.00021000\t\r\n", "Steps: 17000\tLoss: 52756.69531\tPPL: 10.01201\tbleu: 13.62572\tLR: 0.00021000\t\r\n", "Steps: 18000\tLoss: 53123.71094\tPPL: 10.17377\tbleu: 13.38892\tLR: 0.00021000\t\r\n", "Steps: 19000\tLoss: 53380.06250\tPPL: 10.28830\tbleu: 13.57196\tLR: 0.00021000\t\r\n", "Steps: 20000\tLoss: 53823.97266\tPPL: 10.48968\tbleu: 13.91267\tLR: 0.00021000\t\r\n", "Steps: 21000\tLoss: 54286.39453\tPPL: 10.70365\tbleu: 13.27128\tLR: 0.00021000\t\r\n", "Steps: 22000\tLoss: 54675.63672\tPPL: 10.88714\tbleu: 13.92375\tLR: 0.00014700\t\r\n", "Steps: 23000\tLoss: 54995.64453\tPPL: 11.04035\tbleu: 14.00812\tLR: 0.00014700\t\r\n", "Steps: 24000\tLoss: 55179.32031\tPPL: 11.12926\tbleu: 13.90540\tLR: 0.00014700\t\r\n", "Steps: 25000\tLoss: 55375.51953\tPPL: 11.22502\tbleu: 14.03157\tLR: 0.00014700\t\r\n", "Steps: 26000\tLoss: 56022.52734\tPPL: 11.54669\tbleu: 13.91338\tLR: 0.00014700\t\r\n", "Steps: 27000\tLoss: 56223.98438\tPPL: 11.64872\tbleu: 13.99536\tLR: 0.00014700\t\r\n", "Steps: 28000\tLoss: 56479.97266\tPPL: 11.77966\tbleu: 13.98832\tLR: 0.00010290\t\r\n", "Steps: 29000\tLoss: 56550.98828\tPPL: 11.81625\tbleu: 14.11639\tLR: 0.00010290\t\r\n", "Steps: 30000\tLoss: 56852.46875\tPPL: 11.97284\tbleu: 13.81625\tLR: 0.00010290\t\r\n", "Steps: 31000\tLoss: 57068.28906\tPPL: 12.08621\tbleu: 13.60424\tLR: 0.00010290\t\r\n", "Steps: 32000\tLoss: 57246.62109\tPPL: 12.18070\tbleu: 13.84147\tLR: 0.00010290\t\r\n", "Steps: 33000\tLoss: 57541.07031\tPPL: 12.33833\tbleu: 14.17019\tLR: 0.00010290\t\r\n", "Steps: 34000\tLoss: 57767.61719\tPPL: 12.46099\tbleu: 14.23119\tLR: 0.00007203\t\r\n", "Steps: 35000\tLoss: 57825.05469\tPPL: 12.49229\tbleu: 13.97180\tLR: 0.00007203\t\r\n", "Steps: 36000\tLoss: 57961.26562\tPPL: 12.56682\tbleu: 13.99282\tLR: 0.00007203\t\r\n", "Steps: 37000\tLoss: 58058.89844\tPPL: 12.62051\tbleu: 13.90346\tLR: 0.00007203\t\r\n", "Steps: 38000\tLoss: 58165.64844\tPPL: 12.67948\tbleu: 13.88153\tLR: 0.00007203\t\r\n", "Steps: 39000\tLoss: 58317.53906\tPPL: 12.76386\tbleu: 14.04841\tLR: 0.00007203\t\r\n", "Steps: 40000\tLoss: 58420.38281\tPPL: 12.82131\tbleu: 14.22861\tLR: 0.00005042\t\r\n", "Steps: 41000\tLoss: 58521.47266\tPPL: 12.87803\tbleu: 13.83987\tLR: 0.00005042\t\r\n", "Steps: 42000\tLoss: 58654.34375\tPPL: 12.95297\tbleu: 13.85469\tLR: 0.00005042\t\r\n", "Steps: 43000\tLoss: 58695.11719\tPPL: 12.97605\tbleu: 14.19833\tLR: 0.00005042\t\r\n" ] } ], "source": [ "# Output our validation accuracy\n", "! cat \"$experiment_path/models/${src}${tgt}_transformer/validations.txt\"" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "colab": {}, "colab_type": "code", "id": "66WhRE9lIhoD" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "2020-02-09 14:43:30,786 Hello! This is Joey-NMT.\n", "2020-02-09 14:43:49,425 dev bleu: 12.78 [Beam search decoding with beam size = 5 and alpha = 1.0]\n", "2020-02-09 14:44:12,850 test bleu: 24.29 [Beam search decoding with beam size = 5 and alpha = 1.0]\n" ] } ], "source": [ "# Test our model\n", "! cd joeynmt; python3 -m joeynmt test configs/transformer_$src$tgt.yaml" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "accelerator": "GPU", "colab": { "collapsed_sections": [], "name": "starter_notebook.ipynb", "provenance": [], "toc_visible": true }, "kernelspec": { "display_name": "conda_python3", "language": "python", "name": "conda_python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.5" } }, "nbformat": 4, "nbformat_minor": 1 }