{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "accelerator": "GPU",
    "colab": {
      "name": "starter_notebook.ipynb",
      "provenance": [],
      "collapsed_sections": [],
      "toc_visible": true
    },
    "kernelspec": {
      "display_name": "Python 3",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.5.6"
    }
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "Igc5itf-xMGj"
      },
      "source": [
        "# Masakhane - Machine Translation for African Languages (Using JoeyNMT)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "x4fXCKCf36IK"
      },
      "source": [
        "## Note before beginning:\n",
        "### - The idea is that you should be able to make minimal changes to this in order to get SOME result for your own translation corpus. \n",
        "\n",
        "### - The tl;dr: Go to the **\"TODO\"** comments which will tell you what to update to get up and running\n",
        "\n",
        "### - If you actually want to have a clue what you're doing, read the text and peek at the links\n",
        "\n",
        "### - With 100 epochs, it should take around 7 hours to run in Google Colab\n",
        "\n",
        "### - Once you've gotten a result for your language, please attach and email your notebook that generated it to masakhanetranslation@gmail.com\n",
        "\n",
        "### - If you care enough and get a chance, doing a brief background on your language would be amazing. See examples in  [(Martinus, 2019)](https://arxiv.org/abs/1906.05685)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "l929HimrxS0a"
      },
      "source": [
        "## Retrieve your data & make a parallel corpus\n",
        "\n",
        "If you are wanting to use the JW300 data referenced on the Masakhane website or in our GitHub repo, you can use `opus-tools` to convert the data into a convenient format. `opus_read` from that package provides a convenient tool for reading the native aligned XML files and to convert them to TMX format. The tool can also be used to fetch relevant files from OPUS on the fly and to filter the data as necessary. [Read the documentation](https://pypi.org/project/opustools-pkg/) for more details.\n",
        "\n",
        "Once you have your corpus files in TMX format (an xml structure which will include the sentences in your target language and your source language in a single file), we recommend reading them into a pandas dataframe. Thankfully, Jade wrote a silly `tmx2dataframe` package which converts your tmx file to a pandas dataframe. "
      ]
    },
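    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "tmx-sketch-note"
      },
      "source": [
        "The next cell is a small added sketch of the TMX route described above: it streams the `<tu>` translation units of a TMX file with the Python standard library and builds the same `source_sentence`/`target_sentence` dataframe. It is illustrative only (the rest of this notebook fetches the corpus in moses plain-text format rather than TMX), and the file name `jw300.en-urh.tmx` is a hypothetical placeholder, not something produced by the steps below."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "tmx-sketch-code",
        "colab": {}
      },
      "source": [
        "# Added sketch: read a TMX file into a source/target dataframe using only the\n",
        "# standard library. Purely illustrative; 'jw300.en-urh.tmx' is a hypothetical\n",
        "# placeholder and this cell is a no-op unless such a file exists.\n",
        "import os\n",
        "import xml.etree.ElementTree as ET\n",
        "import pandas as pd\n",
        "\n",
        "def tmx_to_dataframe(path, src_lang, trg_lang):\n",
        "    \"\"\"Pair up the <seg> texts of each <tu> translation unit for two languages.\"\"\"\n",
        "    xml_lang = '{http://www.w3.org/XML/1998/namespace}lang'\n",
        "    pairs = []\n",
        "    for _, elem in ET.iterparse(path, events=('end',)):\n",
        "        if elem.tag != 'tu':\n",
        "            continue\n",
        "        segs = {}\n",
        "        for tuv in elem.findall('tuv'):\n",
        "            lang = tuv.get(xml_lang) or tuv.get('lang')\n",
        "            seg = tuv.find('seg')\n",
        "            if lang is not None and seg is not None:\n",
        "                segs[lang] = (seg.text or '').strip()\n",
        "        if src_lang in segs and trg_lang in segs:\n",
        "            pairs.append((segs[src_lang], segs[trg_lang]))\n",
        "        elem.clear()  # keep memory low while streaming through the file\n",
        "    return pd.DataFrame(pairs, columns=['source_sentence', 'target_sentence'])\n",
        "\n",
        "if os.path.exists('jw300.en-urh.tmx'):\n",
        "    df_tmx = tmx_to_dataframe('jw300.en-urh.tmx', 'en', 'urh')\n",
        "    print(df_tmx.head())"
      ],
      "execution_count": 0,
      "outputs": []
    },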
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "oGRmDELn7Az0",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "b8270ce4-6e57-4ce8-e7d9-46d20290dd69"
      },
      "source": [
        "from google.colab import drive\n",
        "drive.mount('/content/drive')"
      ],
      "execution_count": 26,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "Cn3tgQLzUxwn",
        "colab": {}
      },
      "source": [
        "# TODO: Set your source and target languages. Keep in mind, these traditionally use language codes as found here:\n",
        "# These will also become the suffix's of all vocab and corpus files used throughout\n",
        "import os\n",
        "source_language = \"en\"\n",
        "target_language = \"urh\" \n",
        "lc = False  # If True, lowercase the data.\n",
        "seed = 42  # Random seed for shuffling.\n",
        "tag = \"baseline\" # Give a unique name to your folder - this is to ensure you don't rewrite any models you've already submitted\n",
        "\n",
        "os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n",
        "os.environ[\"tgt\"] = target_language\n",
        "os.environ[\"tag\"] = tag\n",
        "\n",
        "# This will save it to a folder in our gdrive instead!\n",
        "!mkdir -p \"/content/drive/My Drive/masakhane/$src-$tgt-$tag\"\n",
        "os.environ[\"gdrive_path\"] = \"/content/drive/My Drive/masakhane/%s-%s-%s\" % (source_language, target_language, tag)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "kBSgJHEw7Nvx",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "d786cba5-3631-4959-8875-c15ef993c21e"
      },
      "source": [
        "!echo $gdrive_path"
      ],
      "execution_count": 28,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "/content/drive/My Drive/masakhane/en-urh-baseline\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "gA75Fs9ys8Y9",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "04370262-f35c-4d29-b251-41d182537b7f"
      },
      "source": [
        "# Install opus-tools\n",
        "! pip install opustools-pkg"
      ],
      "execution_count": 29,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Requirement already satisfied: opustools-pkg in /usr/local/lib/python3.6/dist-packages (0.0.52)\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "xq-tDZVks7ZD",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 187
        },
        "outputId": "b62cc8ce-a08b-494f-ea6d-d262f7d2c6f8"
      },
      "source": [
        "# Downloading our corpus\n",
        "! opus_read -d JW300 -s $src -t $tgt -wm moses -w jw300.$src jw300.$tgt -q\n",
        "\n",
        "# extract the corpus file\n",
        "! gunzip JW300_latest_xml_$src-$tgt.xml.gz"
      ],
      "execution_count": 30,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\n",
            "Alignment file /proj/nlpl/data/OPUS/JW300/latest/xml/en-urh.xml.gz not found. The following files are available for downloading:\n",
            "\n",
            "        ./JW300_latest_xml_en.zip already exists\n",
            "        ./JW300_latest_xml_urh.zip already exists\n",
            " 304 KB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en-urh.xml.gz\n",
            "\n",
            " 304 KB Total size\n",
            "./JW300_latest_xml_en-urh.xml.gz ... 100% of 304 KB\n",
            "gzip: JW300_latest_xml_en-urh.xml already exists; do you wish to overwrite (y or n)? y\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "n48GDRnP8y2G",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 578
        },
        "outputId": "b50339aa-aae7-4027-a985-f303c50b290b"
      },
      "source": [
        "# Download the global test set.\n",
        "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-any.en\n",
        "  \n",
        "# And the specific test set for this language pair.\n",
        "os.environ[\"trg\"] = target_language \n",
        "os.environ[\"src\"] = source_language \n",
        "\n",
        "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-$trg.en \n",
        "! mv test.en-$trg.en test.en\n",
        "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-$trg.$trg \n",
        "! mv test.en-$trg.$trg test.$trg"
      ],
      "execution_count": 31,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "--2019-12-31 06:41:57--  https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-any.en\n",
            "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n",
            "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n",
            "HTTP request sent, awaiting response... 200 OK\n",
            "Length: 277791 (271K) [text/plain]\n",
            "Saving to: ‘test.en-any.en.1’\n",
            "\n",
            "\rtest.en-any.en.1      0%[                    ]       0  --.-KB/s               \rtest.en-any.en.1    100%[===================>] 271.28K  --.-KB/s    in 0.02s   \n",
            "\n",
            "2019-12-31 06:41:57 (16.5 MB/s) - ‘test.en-any.en.1’ saved [277791/277791]\n",
            "\n",
            "--2019-12-31 06:41:59--  https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-urh.en\n",
            "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n",
            "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n",
            "HTTP request sent, awaiting response... 200 OK\n",
            "Length: 201504 (197K) [text/plain]\n",
            "Saving to: ‘test.en-urh.en’\n",
            "\n",
            "test.en-urh.en      100%[===================>] 196.78K  --.-KB/s    in 0.01s   \n",
            "\n",
            "2019-12-31 06:41:59 (13.8 MB/s) - ‘test.en-urh.en’ saved [201504/201504]\n",
            "\n",
            "--2019-12-31 06:42:02--  https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-urh.urh\n",
            "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n",
            "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n",
            "HTTP request sent, awaiting response... 200 OK\n",
            "Length: 236859 (231K) [text/plain]\n",
            "Saving to: ‘test.en-urh.urh’\n",
            "\n",
            "test.en-urh.urh     100%[===================>] 231.31K  --.-KB/s    in 0.02s   \n",
            "\n",
            "2019-12-31 06:42:03 (14.5 MB/s) - ‘test.en-urh.urh’ saved [236859/236859]\n",
            "\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "NqDG-CI28y2L",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "d11c61cb-d165-4fcd-cb20-92015d712ce5"
      },
      "source": [
        "# Read the test data to filter from train and dev splits.\n",
        "# Store english portion in set for quick filtering checks.\n",
        "en_test_sents = set()\n",
        "filter_test_sents = \"test.en-any.en\"\n",
        "j = 0\n",
        "with open(filter_test_sents) as f:\n",
        "  for line in f:\n",
        "    en_test_sents.add(line.strip())\n",
        "    j += 1\n",
        "print('Loaded {} global test sentences to filter from the training/dev data.'.format(j))"
      ],
      "execution_count": 32,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Loaded 3571 global test sentences to filter from the training/dev data.\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "3CNdwLBCfSIl",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 159
        },
        "outputId": "077bb76e-99f2-49e4-f539-1cd7a206eefa"
      },
      "source": [
        "import pandas as pd\n",
        "\n",
        "# TMX file to dataframe\n",
        "source_file = 'jw300.' + source_language\n",
        "target_file = 'jw300.' + target_language\n",
        "\n",
        "source = []\n",
        "target = []\n",
        "skip_lines = []  # Collect the line numbers of the source portion to skip the same lines for the target portion.\n",
        "with open(source_file) as f:\n",
        "    for i, line in enumerate(f):\n",
        "        # Skip sentences that are contained in the test set.\n",
        "        if line.strip() not in en_test_sents:\n",
        "            source.append(line.strip())\n",
        "        else:\n",
        "            skip_lines.append(i)             \n",
        "with open(target_file) as f:\n",
        "    for j, line in enumerate(f):\n",
        "        # Only add to corpus if corresponding source was not skipped.\n",
        "        if j not in skip_lines:\n",
        "            target.append(line.strip())\n",
        "    \n",
        "print('Loaded data and skipped {}/{} lines since contained in test set.'.format(len(skip_lines), i))\n",
        "    \n",
        "df = pd.DataFrame(zip(source, target), columns=['source_sentence', 'target_sentence'])\n",
        "# if you get TypeError: data argument can't be an iterator is because of your zip version run this below\n",
        "#df = pd.DataFrame(list(zip(source, target)), columns=['source_sentence', 'target_sentence'])\n",
        "df.head(3)"
      ],
      "execution_count": 33,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Loaded data and skipped 4050/32709 lines since contained in test set.\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>source_sentence</th>\n",
              "      <th>target_sentence</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>Why It Pays to Be Honest 6</td>\n",
              "      <td>Erere Herọ Ra Vwọ Dia Ohwo rẹ Uyota 5</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>The Bible Changes Lives</td>\n",
              "      <td>7 Ovwan “ Jẹn Ẹguọnọ rẹ Iniọvo na Dje Ebuoebuo...</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>Give Me Just One Year of Peace and Happiness 8...</td>\n",
              "      <td>12 Jẹ ‘ Ẹse rẹ Ọghẹnẹ rẹ Unu se Gbe - e na , ’...</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                                     source_sentence                                    target_sentence\n",
              "0                         Why It Pays to Be Honest 6              Erere Herọ Ra Vwọ Dia Ohwo rẹ Uyota 5\n",
              "1                            The Bible Changes Lives  7 Ovwan “ Jẹn Ẹguọnọ rẹ Iniọvo na Dje Ebuoebuo...\n",
              "2  Give Me Just One Year of Peace and Happiness 8...  12 Jẹ ‘ Ẹse rẹ Ọghẹnẹ rẹ Unu se Gbe - e na , ’..."
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 33
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "YkuK3B4p2AkN"
      },
      "source": [
        "## Pre-processing and export\n",
        "\n",
        "It is generally a good idea to remove duplicate translations and conflicting translations from the corpus. In practice, these public corpora include some number of these that need to be cleaned.\n",
        "\n",
        "In addition we will split our data into dev/test/train and export to the filesystem."
      ]
    },
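    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "dedup-sketch-note"
      },
      "source": [
        "The cell below is an added, optional sketch of the same cleaning steps written as a single chained expression. The original cell that follows it uses `drop_duplicates(..., inplace=True)` and emits a pandas `SettingWithCopyWarning` (visible in its output); the chained form avoids the warning while producing the same result. It is shown for reference only and is not used by the rest of the notebook."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "dedup-sketch-code",
        "colab": {}
      },
      "source": [
        "# Added sketch: warning-free version of the duplicate/conflict removal and shuffle.\n",
        "# For reference only; the original pipeline in the next cell remains authoritative.\n",
        "def clean_corpus(frame, shuffle_seed=42):\n",
        "    \"\"\"Drop exact duplicates, then conflicting translations, then shuffle.\"\"\"\n",
        "    return (frame.drop_duplicates()\n",
        "                 .drop_duplicates(subset='source_sentence')\n",
        "                 .drop_duplicates(subset='target_sentence')\n",
        "                 .sample(frac=1, random_state=shuffle_seed)\n",
        "                 .reset_index(drop=True))\n",
        "\n",
        "# Example usage (commented out so the next cell's df_pp is the one actually used):\n",
        "# df_pp = clean_corpus(df, shuffle_seed=seed)"
      ],
      "execution_count": 0,
      "outputs": []
    },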
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "M_2ouEOH1_1q",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 187
        },
        "outputId": "72a20551-8b78-4ce6-f227-0a5049476674"
      },
      "source": [
        "# drop duplicate translations\n",
        "df_pp = df.drop_duplicates()\n",
        "\n",
        "# drop conflicting translations\n",
        "# (this is optional and something that you might want to comment out \n",
        "# depending on the size of your corpus)\n",
        "df_pp.drop_duplicates(subset='source_sentence', inplace=True)\n",
        "df_pp.drop_duplicates(subset='target_sentence', inplace=True)\n",
        "\n",
        "# Shuffle the data to remove bias in dev set selection.\n",
        "df_pp = df_pp.sample(frac=1, random_state=seed).reset_index(drop=True)"
      ],
      "execution_count": 34,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:6: SettingWithCopyWarning: \n",
            "A value is trying to be set on a copy of a slice from a DataFrame\n",
            "\n",
            "See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n",
            "  \n",
            "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:7: SettingWithCopyWarning: \n",
            "A value is trying to be set on a copy of a slice from a DataFrame\n",
            "\n",
            "See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n",
            "  import sys\n"
          ],
          "name": "stderr"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Z_1BwAApEtMk",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 561
        },
        "outputId": "ae7e81b6-ebc5-47d3-c10d-a9dd2b2ced41"
      },
      "source": [
        "# Install fuzzy wuzzy to remove \"almost duplicate\" sentences in the\n",
        "# test and training sets.\n",
        "! pip install fuzzywuzzy\n",
        "! pip install python-Levenshtein\n",
        "import time\n",
        "from fuzzywuzzy import process\n",
        "import numpy as np\n",
        "\n",
        "# reset the index of the training set after previous filtering\n",
        "df_pp.reset_index(drop=False, inplace=True)\n",
        "\n",
        "# Remove samples from the training data set if they \"almost overlap\" with the\n",
        "# samples in the test set.\n",
        "\n",
        "# Filtering function. Adjust pad to narrow down the candidate matches to\n",
        "# within a certain length of characters of the given sample.\n",
        "def fuzzfilter(sample, candidates, pad):\n",
        "  candidates = [x for x in candidates if len(x) <= len(sample)+pad and len(x) >= len(sample)-pad] \n",
        "  if len(candidates) > 0:\n",
        "    return process.extractOne(sample, candidates)[1]\n",
        "  else:\n",
        "    return np.nan\n",
        "\n",
        "# NOTE - This might run slow depending on the size of your training set. We are\n",
        "# printing some information to help you track how long it would take. \n",
        "scores = []\n",
        "start_time = time.time()\n",
        "for idx, row in df_pp.iterrows():\n",
        "  scores.append(fuzzfilter(row['source_sentence'], list(en_test_sents), 5))\n",
        "  if idx % 1000 == 0:\n",
        "    hours, rem = divmod(time.time() - start_time, 3600)\n",
        "    minutes, seconds = divmod(rem, 60)\n",
        "    print(\"{:0>2}:{:0>2}:{:05.2f}\".format(int(hours),int(minutes),seconds), \"%0.2f percent complete\" % (100.0*float(idx)/float(len(df_pp))))\n",
        "\n",
        "# Filter out \"almost overlapping samples\"\n",
        "df_pp['scores'] = scores\n",
        "df_pp = df_pp[df_pp['scores'] < 95]"
      ],
      "execution_count": 35,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Requirement already satisfied: fuzzywuzzy in /usr/local/lib/python3.6/dist-packages (0.17.0)\n",
            "Requirement already satisfied: python-Levenshtein in /usr/local/lib/python3.6/dist-packages (0.12.0)\n",
            "Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from python-Levenshtein) (42.0.2)\n",
            "00:00:00.04 0.00 percent complete\n",
            "00:00:21.42 3.74 percent complete\n",
            "00:00:42.25 7.49 percent complete\n",
            "00:01:03.74 11.23 percent complete\n",
            "00:01:26.69 14.98 percent complete\n",
            "00:01:47.64 18.72 percent complete\n",
            "00:02:08.31 22.46 percent complete\n",
            "00:02:28.50 26.21 percent complete\n",
            "00:02:48.45 29.95 percent complete\n",
            "00:03:08.71 33.70 percent complete\n",
            "00:03:28.66 37.44 percent complete\n",
            "00:03:49.30 41.18 percent complete\n",
            "00:04:09.78 44.93 percent complete\n",
            "00:04:29.06 48.67 percent complete\n",
            "00:04:48.60 52.41 percent complete\n",
            "00:05:08.53 56.16 percent complete\n",
            "00:05:29.01 59.90 percent complete\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "stream",
          "text": [
            "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '']\n"
          ],
          "name": "stderr"
        },
        {
          "output_type": "stream",
          "text": [
            "00:05:48.90 63.65 percent complete\n",
            "00:06:08.27 67.39 percent complete\n",
            "00:06:28.27 71.13 percent complete\n",
            "00:06:49.04 74.88 percent complete\n",
            "00:07:08.41 78.62 percent complete\n",
            "00:07:28.22 82.37 percent complete\n",
            "00:07:47.22 86.11 percent complete\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "stream",
          "text": [
            "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '*']\n"
          ],
          "name": "stderr"
        },
        {
          "output_type": "stream",
          "text": [
            "00:08:06.79 89.85 percent complete\n",
            "00:08:26.41 93.60 percent complete\n",
            "00:08:46.27 97.34 percent complete\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "hxxBOCA-xXhy",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        },
        "outputId": "52e8e2ea-c8f5-4eff-958f-d185ca390e8e"
      },
      "source": [
        "# This section does the split between train/dev for the parallel corpora then saves them as separate files\n",
        "# We use 1000 dev test and the given test set.\n",
        "import csv\n",
        "\n",
        "# Do the split between dev/train and create parallel corpora\n",
        "num_dev_patterns = 1000\n",
        "\n",
        "# Optional: lower case the corpora - this will make it easier to generalize, but without proper casing.\n",
        "if lc:  # Julia: making lowercasing optional\n",
        "    df_pp[\"source_sentence\"] = df_pp[\"source_sentence\"].str.lower()\n",
        "    df_pp[\"target_sentence\"] = df_pp[\"target_sentence\"].str.lower()\n",
        "\n",
        "# Julia: test sets are already generated\n",
        "dev = df_pp.tail(num_dev_patterns) # Herman: Error in original\n",
        "stripped = df_pp.drop(df_pp.tail(num_dev_patterns).index)\n",
        "\n",
        "with open(\"train.\"+source_language, \"w\") as src_file, open(\"train.\"+target_language, \"w\") as trg_file:\n",
        "  for index, row in stripped.iterrows():\n",
        "    src_file.write(row[\"source_sentence\"]+\"\\n\")\n",
        "    trg_file.write(row[\"target_sentence\"]+\"\\n\")\n",
        "    \n",
        "with open(\"dev.\"+source_language, \"w\") as src_file, open(\"dev.\"+target_language, \"w\") as trg_file:\n",
        "  for index, row in dev.iterrows():\n",
        "    src_file.write(row[\"source_sentence\"]+\"\\n\")\n",
        "    trg_file.write(row[\"target_sentence\"]+\"\\n\")\n",
        "\n",
        "#stripped[[\"source_sentence\"]].to_csv(\"train.\"+source_language, header=False, index=False)  # Herman: Added `header=False` everywhere\n",
        "#stripped[[\"target_sentence\"]].to_csv(\"train.\"+target_language, header=False, index=False)  # Julia: Problematic handling of quotation marks.\n",
        "\n",
        "#dev[[\"source_sentence\"]].to_csv(\"dev.\"+source_language, header=False, index=False)\n",
        "#dev[[\"target_sentence\"]].to_csv(\"dev.\"+target_language, header=False, index=False)\n",
        "\n",
        "# Doublecheck the format below. There should be no extra quotation marks or weird characters.\n",
        "! head train.*\n",
        "! head dev.*"
      ],
      "execution_count": 36,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "==> train.bpe.en <==\n",
            "The number of publishers is now about ten times what it was when I began serving here .\n",
            "Sim@@ il@@ arly , elders should not only encourage and con@@ so@@ le their brothers with words but also build them up by showing sin@@ c@@ ere personal interest . ​ — 1 Cor .\n",
            "17 Why We “ Keep B@@ earing M@@ u@@ ch F@@ ru@@ it ”\n",
            "Now I have a j@@ our@@ n@@ al that I keep on my des@@ k to s@@ che@@ du@@ le up@@ coming work , and this helps me to s@@ che@@ du@@ le my@@ self , not lea@@ ving things till the last min@@ ute . ”\n",
            "1 , 2 . ( a ) How do some react to the thought that God has an organization ?\n",
            "We cannot go to the point of dis@@ obey@@ ing God or viol@@ ating our Christian ne@@ u@@ tr@@ ality . ​ — Read 1 Peter 2 : 13 - 17 .\n",
            "Did this mean freedom for every liter@@ al slave ?\n",
            "E@@ ven@@ tually , all my si@@ bl@@ ings did so and became Jehovah’s Witnesses .\n",
            "How plea@@ sed Jehovah will be as he ob@@ ser@@ ves our whole - s@@ ou@@ led efforts to “ keep b@@ earing much fruit ” !\n",
            "Joseph , though , was a disci@@ ple , but he could not bring himself to say so op@@ en@@ ly .\n",
            "\n",
            "==> train.bpe.urh <==\n",
            "I@@ ghwoghwota rehẹ ẹkuotọ na enẹna vwẹ ọh@@ wọ@@ h@@ wọ ihwe vwo bun vrẹ obo rọ hepha ọke me vwọ ga vwẹ oboyin .\n",
            "( 2 Kọr . 12 : 15 ) Vwẹ idjerhe vuọvo na , vwọ vrẹ ota rẹ unu rẹ ekpako cha vwọ bọn iniọvo na gan , o ji fo nẹ ayen ru obo ro che djephia nẹ ayen vwo ọdavwẹ rayen . ​ — 1 Kọr .\n",
            "17 Obo@@ resorọ O Vwo F@@ o N@@ ẹ A “ M@@ ọ I@@ b@@ i Bu@@ eb@@ u ”\n",
            "Asaọkiephana , mi vwo ẹbe rẹ mi si ọrhuẹrẹphiyotọ mẹ phiyọ , ọnana vwẹ ukẹcha kẹ vwẹ vwọ nabọ vwẹrote iruo mẹ , me rha yan@@ jẹ ọvuọvo vwo hẹrhẹ im@@ ib@@ r@@ ẹro ri chekọ bẹ@@ siẹ ọke na vwo re - e . ”\n",
            "1 , 2 . ( a ) Die yen ihwo evo ta siẹrẹ ayen de nyo nẹ Ọghẹnẹ vwo ukoko ?\n",
            "Avwanre cha sa chu@@ rhi rẹ Ọghẹnẹ fikirẹ aye - en yẹrẹ dia ẹbẹre ọvo rẹ akpọ na - a . — Se 1 Pita 2 : 13 - 17 .\n",
            "( Luk 4 : 18 ) Ọnana mudiaphiyọ egbomọphẹ vwọ kẹ ihwo re mu kpo eviẹn ?\n",
            "Ukuotọ rọyen , iniọvo mẹ eje de yono Baibol ji bromaphiyame kerẹ Iseri rẹ Jihova .\n",
            "O muẹro dẹn nẹ oma nabọ vwerhen Jihova kọke kọke rọ da mrẹ oborẹ avwanre davw@@ an te , ra vwọ “ mọ ib@@ i bu@@ eb@@ u ” !\n",
            "Ẹkẹvuọvo , Josẹf ọyen odibo rẹ Jesu ro se dje oma phia vwẹ az@@ a@@ gba - a .\n",
            "\n",
            "==> train.en <==\n",
            "The number of publishers is now about ten times what it was when I began serving here .\n",
            "Similarly , elders should not only encourage and console their brothers with words but also build them up by showing sincere personal interest . ​ — 1 Cor .\n",
            "17 Why We “ Keep Bearing Much Fruit ”\n",
            "Now I have a journal that I keep on my desk to schedule upcoming work , and this helps me to schedule myself , not leaving things till the last minute . ”\n",
            "1 , 2 . ( a ) How do some react to the thought that God has an organization ?\n",
            "We cannot go to the point of disobeying God or violating our Christian neutrality . ​ — Read 1 Peter 2 : 13 - 17 .\n",
            "Did this mean freedom for every literal slave ?\n",
            "Eventually , all my siblings did so and became Jehovah’s Witnesses .\n",
            "How pleased Jehovah will be as he observes our whole - souled efforts to “ keep bearing much fruit ” !\n",
            "Joseph , though , was a disciple , but he could not bring himself to say so openly .\n",
            "\n",
            "==> train.urh <==\n",
            "Ighwoghwota rehẹ ẹkuotọ na enẹna vwẹ ọhwọhwọ ihwe vwo bun vrẹ obo rọ hepha ọke me vwọ ga vwẹ oboyin .\n",
            "( 2 Kọr . 12 : 15 ) Vwẹ idjerhe vuọvo na , vwọ vrẹ ota rẹ unu rẹ ekpako cha vwọ bọn iniọvo na gan , o ji fo nẹ ayen ru obo ro che djephia nẹ ayen vwo ọdavwẹ rayen . ​ — 1 Kọr .\n",
            "17 Oboresorọ O Vwo Fo Nẹ A “ Mọ Ibi Buebu ”\n",
            "Asaọkiephana , mi vwo ẹbe rẹ mi si ọrhuẹrẹphiyotọ mẹ phiyọ , ọnana vwẹ ukẹcha kẹ vwẹ vwọ nabọ vwẹrote iruo mẹ , me rha yanjẹ ọvuọvo vwo hẹrhẹ imibrẹro ri chekọ bẹsiẹ ọke na vwo re - e . ”\n",
            "1 , 2 . ( a ) Die yen ihwo evo ta siẹrẹ ayen de nyo nẹ Ọghẹnẹ vwo ukoko ?\n",
            "Avwanre cha sa churhi rẹ Ọghẹnẹ fikirẹ aye - en yẹrẹ dia ẹbẹre ọvo rẹ akpọ na - a . — Se 1 Pita 2 : 13 - 17 .\n",
            "( Luk 4 : 18 ) Ọnana mudiaphiyọ egbomọphẹ vwọ kẹ ihwo re mu kpo eviẹn ?\n",
            "Ukuotọ rọyen , iniọvo mẹ eje de yono Baibol ji bromaphiyame kerẹ Iseri rẹ Jihova .\n",
            "O muẹro dẹn nẹ oma nabọ vwerhen Jihova kọke kọke rọ da mrẹ oborẹ avwanre davwan te , ra vwọ “ mọ ibi buebu ” !\n",
            "Ẹkẹvuọvo , Josẹf ọyen odibo rẹ Jesu ro se dje oma phia vwẹ azagba - a .\n",
            "==> dev.bpe.en <==\n",
            "These or@@ ch@@ es@@ tra@@ l arrang@@ ements are com@@ po@@ sed in such a way that they will pre@@ p@@ are our heart and mind for the pro@@ gra@@ m to follow .\n",
            "Today he is serving at Bethel .\n",
            "But freedom from what ?\n",
            "A@@ vo@@ id com@@ par@@ ing your new congregation with your pre@@ vi@@ ous one .\n",
            "2 : 16 , 17 .\n",
            "As stated , the v@@ indic@@ ation of Jehovah’s sovereignty is a v@@ ital issue invol@@ ving mankind .\n",
            "That is especially so if our trea@@ ch@@ er@@ ous heart tu@@ g@@ s us in the opp@@ o@@ sit@@ e direction .\n",
            "At times , this resul@@ ted in more money going out than coming in for a peri@@ od of time .\n",
            "How did hope re@@ infor@@ ce No@@ ah’s faith ?\n",
            "What prom@@ p@@ ts a mother to care ten@@ der@@ ly for her new@@ born b@@ ab@@ y ?\n",
            "\n",
            "==> dev.bpe.urh <==\n",
            "E ru u@@ hworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọ@@ rhuẹrẹ@@ phiyọ rẹ ẹdẹ yena .\n",
            "Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "Wọ vwẹ ukoko kpokpọ na vwọ vw@@ an@@ vw@@ en ọ rẹ wo nurhe na - a .\n",
            "2 : 16 , 17 .\n",
            "Kirobo ra t@@ arọ jovwo , eti@@ to rẹ usuon rẹ Jihova , ọyen ota ọghanghanre ro fori nẹ ihworakpọ tẹnrovi .\n",
            "M@@ a rho , udu avwanre rọ vọnre vẹ o@@ phi@@ ẹnvwe na da vuẹ avwanre nẹ e ru obo re chọre .\n",
            "Iruo kpokpọ nana nẹrhẹ a ghwọrọ vrẹ obo re tor@@ ori ọkiọvo .\n",
            "Mavọ yen iphiẹrophiyọ vwọ nẹrhẹ esegbuyota rẹ Noa ganphiyọ ?\n",
            "Die yen mu oni vwọ vwẹrote ọmọ ro ghwe vwiẹ ?\n",
            "\n",
            "==> dev.en <==\n",
            "These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "Today he is serving at Bethel .\n",
            "But freedom from what ?\n",
            "Avoid comparing your new congregation with your previous one .\n",
            "2 : 16 , 17 .\n",
            "As stated , the vindication of Jehovah’s sovereignty is a vital issue involving mankind .\n",
            "That is especially so if our treacherous heart tugs us in the opposite direction .\n",
            "At times , this resulted in more money going out than coming in for a period of time .\n",
            "How did hope reinforce Noah’s faith ?\n",
            "What prompts a mother to care tenderly for her newborn baby ?\n",
            "\n",
            "==> dev.urh <==\n",
            "E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2 : 16 , 17 .\n",
            "Kirobo ra tarọ jovwo , etito rẹ usuon rẹ Jihova , ọyen ota ọghanghanre ro fori nẹ ihworakpọ tẹnrovi .\n",
            "Ma rho , udu avwanre rọ vọnre vẹ ophiẹnvwe na da vuẹ avwanre nẹ e ru obo re chọre .\n",
            "Iruo kpokpọ nana nẹrhẹ a ghwọrọ vrẹ obo re torori ọkiọvo .\n",
            "Mavọ yen iphiẹrophiyọ vwọ nẹrhẹ esegbuyota rẹ Noa ganphiyọ ?\n",
            "Die yen mu oni vwọ vwẹrote ọmọ ro ghwe vwiẹ ?\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "epeCydmCyS8X"
      },
      "source": [
        "\n",
        "\n",
        "---\n",
        "\n",
        "\n",
        "## Installation of JoeyNMT\n",
        "\n",
        "JoeyNMT is a simple, minimalist NMT package which is useful for learning and teaching. Check out the documentation for JoeyNMT [here](https://joeynmt.readthedocs.io)  "
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "iBRMm4kMxZ8L",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        },
        "outputId": "0aa61180-5002-4396-b2d2-2e0744e5057b"
      },
      "source": [
        "# Install JoeyNMT\n",
        "! git clone https://github.com/joeynmt/joeynmt.git\n",
        "! cd joeynmt; pip3 install ."
      ],
      "execution_count": 37,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "fatal: destination path 'joeynmt' already exists and is not an empty directory.\n",
            "Processing /content/joeynmt\n",
            "Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.16.0)\n",
            "Requirement already satisfied: pillow in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (4.3.0)\n",
            "Requirement already satisfied: numpy<2.0,>=1.14.5 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.17.4)\n",
            "Requirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (42.0.2)\n",
            "Requirement already satisfied: torch>=1.1 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.3.1)\n",
            "Requirement already satisfied: tensorflow>=1.14 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.15.0)\n",
            "Requirement already satisfied: torchtext in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.3.1)\n",
            "Requirement already satisfied: sacrebleu>=1.3.6 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.4.3)\n",
            "Requirement already satisfied: subword-nmt in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.3.7)\n",
            "Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (3.1.2)\n",
            "Requirement already satisfied: seaborn in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.9.0)\n",
            "Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (5.2)\n",
            "Requirement already satisfied: pylint in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (2.4.4)\n",
            "Requirement already satisfied: six==1.12 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.12.0)\n",
            "Requirement already satisfied: olefile in /usr/local/lib/python3.6/dist-packages (from pillow->joeynmt==0.0.1) (0.46)\n",
            "Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.1.0)\n",
            "Requirement already satisfied: gast==0.2.2 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.2.2)\n",
            "Requirement already satisfied: tensorflow-estimator==1.15.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.1)\n",
            "Requirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.1.0)\n",
            "Requirement already satisfied: astor>=0.6.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.8.1)\n",
            "Requirement already satisfied: keras-applications>=1.0.8 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.0.8)\n",
            "Requirement already satisfied: google-pasta>=0.1.6 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.1.8)\n",
            "Requirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.1.0)\n",
            "Requirement already satisfied: wrapt>=1.11.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.11.2)\n",
            "Requirement already satisfied: wheel>=0.26 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.33.6)\n",
            "Requirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.0)\n",
            "Requirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.10.0)\n",
            "Requirement already satisfied: absl-py>=0.7.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.8.1)\n",
            "Requirement already satisfied: tensorboard<1.16.0,>=1.15.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.0)\n",
            "Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (4.28.1)\n",
            "Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (2.21.0)\n",
            "Requirement already satisfied: typing in /usr/local/lib/python3.6/dist-packages (from sacrebleu>=1.3.6->joeynmt==0.0.1) (3.6.6)\n",
            "Requirement already satisfied: portalocker in /usr/local/lib/python3.6/dist-packages (from sacrebleu>=1.3.6->joeynmt==0.0.1) (1.5.2)\n",
            "Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.4.5)\n",
            "Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (1.1.0)\n",
            "Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (0.10.0)\n",
            "Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.6.1)\n",
            "Requirement already satisfied: pandas>=0.15.2 in /usr/local/lib/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (0.25.3)\n",
            "Requirement already satisfied: scipy>=0.14.0 in /usr/local/lib/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (1.3.3)\n",
            "Requirement already satisfied: mccabe<0.7,>=0.6 in /usr/local/lib/python3.6/dist-packages (from pylint->joeynmt==0.0.1) (0.6.1)\n",
            "Requirement already satisfied: isort<5,>=4.2.5 in /usr/local/lib/python3.6/dist-packages (from pylint->joeynmt==0.0.1) (4.3.21)\n",
            "Requirement already satisfied: astroid<2.4,>=2.3.0 in /usr/local/lib/python3.6/dist-packages (from pylint->joeynmt==0.0.1) (2.3.3)\n",
            "Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from keras-applications>=1.0.8->tensorflow>=1.14->joeynmt==0.0.1) (2.8.0)\n",
            "Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.6/dist-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow>=1.14->joeynmt==0.0.1) (0.16.0)\n",
            "Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/dist-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow>=1.14->joeynmt==0.0.1) (3.1.1)\n",
            "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (3.0.4)\n",
            "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (1.24.3)\n",
            "Requirement already satisfied: idna<2.9,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2.8)\n",
            "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2019.11.28)\n",
            "Requirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas>=0.15.2->seaborn->joeynmt==0.0.1) (2018.9)\n",
            "Requirement already satisfied: lazy-object-proxy==1.4.* in /usr/local/lib/python3.6/dist-packages (from astroid<2.4,>=2.3.0->pylint->joeynmt==0.0.1) (1.4.3)\n",
            "Requirement already satisfied: typed-ast<1.5,>=1.4.0; implementation_name == \"cpython\" and python_version < \"3.8\" in /usr/local/lib/python3.6/dist-packages (from astroid<2.4,>=2.3.0->pylint->joeynmt==0.0.1) (1.4.0)\n",
            "Building wheels for collected packages: joeynmt\n",
            "  Building wheel for joeynmt (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for joeynmt: filename=joeynmt-0.0.1-cp36-none-any.whl size=72136 sha256=3cc6a4de7274fdcab65f90e3af85b3579607dee1b0258e3b260af28dd3e0bb15\n",
            "  Stored in directory: /tmp/pip-ephem-wheel-cache-dfaqemza/wheels/db/01/db/751cc9f3e7f6faec127c43644ba250a3ea7ad200594aeda70a\n",
            "Successfully built joeynmt\n",
            "Installing collected packages: joeynmt\n",
            "  Found existing installation: joeynmt 0.0.1\n",
            "    Uninstalling joeynmt-0.0.1:\n",
            "      Successfully uninstalled joeynmt-0.0.1\n",
            "Successfully installed joeynmt-0.0.1\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "AaE77Tcppex9"
      },
      "source": [
        "# Preprocessing the Data into Subword BPE Tokens\n",
        "\n",
        "- One of the most powerful improvements for agglutinative languages (a feature of most Bantu languages) is using BPE tokenization [ (Sennrich, 2015) ](https://arxiv.org/abs/1508.07909).\n",
        "\n",
        "- It was also shown that by optimizing the umber of BPE codes we significantly improve results for low-resourced languages [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021) [(Martinus, 2019)](https://arxiv.org/abs/1906.05685)\n",
        "\n",
        "- Below we have the scripts for doing BPE tokenization of our data. We use 4000 tokens as recommended by [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021). You do not need to change anything. Simply running the below will be suitable. "
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "0DhFg6tlqVW5",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 204
        },
        "outputId": "6cfdc1c4-5eb8-4459-82ef-aee13899953a"
      },
      "source": [
        "!ls drive/'My Drive'/masakhane/en-urh-baseline/models/enurh_transformer"
      ],
      "execution_count": 38,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "00001000.hyps.dev   15000.hyps\t25000.hyps  35000.hyps\t7000.ckpt\n",
            "00001000.hyps.test  16000.hyps\t26000.hyps  36000.hyps\t7000.hyps\n",
            "00007000.hyps.dev   17000.hyps\t27000.hyps  37000.hyps\t8000.hyps\n",
            "00007000.hyps.test  18000.hyps\t28000.hyps  38000.hyps\t9000.hyps\n",
            "10000.hyps\t    19000.hyps\t29000.hyps  39000.hyps\tconfig.yaml\n",
            "1000.ckpt\t    20000.hyps\t30000.hyps  40000.hyps\tsrc_vocab.txt\n",
            "1000.hyps\t    2000.hyps\t3000.hyps   4000.hyps\ttensorboard\n",
            "11000.hyps\t    21000.hyps\t31000.hyps  5000.ckpt\ttrain.log\n",
            "12000.hyps\t    22000.hyps\t32000.hyps  5000.hyps\ttrg_vocab.txt\n",
            "13000.hyps\t    23000.hyps\t33000.hyps  6000.ckpt\tvalidations.txt\n",
            "14000.hyps\t    24000.hyps\t34000.hyps  6000.hyps\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "H-TyjtmXB1mL",
        "colab": {}
      },
      "source": [
        "# ##### IOHAVOC MODIFICATIONS ==>>  WE DO NOT WANT TO DO BPE\n",
        "\n",
        "# # One of the huge boosts in NMT performance was to use a different method of tokenizing. \n",
        "# # Usually, NMT would tokenize by words. However, using a method called BPE gave amazing boosts to performance\n",
        "\n",
        "# # Do subword NMT \n",
        "# from os import path\n",
        "# os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n",
        "# os.environ[\"tgt\"] = target_language\n",
        "\n",
        "# os.environ[\"data_path\"] = path.join(\"joeynmt\", \"data\", source_language + target_language) # Herman! \n",
        "\n",
        "# # Learn BPEs on the training data.\n",
        "# ! subword-nmt learn-joint-bpe-and-vocab --input train.$src train.$tgt -s 4000 -o bpe.codes.4000 --write-vocabulary vocab.$src vocab.$tgt\n",
        "\n",
        "# # Apply BPE splits to the development and test data.\n",
        "# ! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < train.$src > train.bpe.$src\n",
        "# ! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < train.$tgt > train.bpe.$tgt\n",
        "\n",
        "# ! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < dev.$src > dev.bpe.$src\n",
        "# ! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < dev.$tgt > dev.bpe.$tgt\n",
        "# ! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < test.$src > test.bpe.$src\n",
        "# ! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < test.$tgt > test.bpe.$tgt\n",
        "\n",
        "# # Create directory, move everyone we care about to the correct location\n",
        "# ! mkdir -p $data_path\n",
        "# ! cp train.* $data_path\n",
        "# ! cp test.* $data_path\n",
        "# ! cp dev.* $data_path\n",
        "# ! cp bpe.codes.4000 $data_path\n",
        "# ! ls $data_path\n",
        "\n",
        "# # Also move everything we care about to a mounted location in google drive (relevant if running in colab) at gdrive_path\n",
        "# ! cp train.* \"$gdrive_path\"\n",
        "# ! cp test.* \"$gdrive_path\"\n",
        "# ! cp dev.* \"$gdrive_path\"\n",
        "# ! cp bpe.codes.4000 \"$gdrive_path\"\n",
        "# ! ls \"$gdrive_path\"\n",
        "\n",
        "# # Create that vocab using build_vocab\n",
        "# ! sudo chmod 777 joeynmt/scripts/build_vocab.py\n",
        "# ! joeynmt/scripts/build_vocab.py joeynmt/data/$src$tgt/train.bpe.$src joeynmt/data/$src$tgt/train.bpe.$tgt --output_path joeynmt/data/$src$tgt/vocab.txt\n",
        "\n",
        "# # Some output\n",
        "# ! echo \"BPE Urhobo Sentences\"\n",
        "# ! tail -n 5 test.bpe.$tgt\n",
        "# ! echo \"Combined BPE Vocab\"\n",
        "# ! tail -n 10 joeynmt/data/$src$tgt/vocab.txt  # Herman"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "gRiUoc_ryUR8",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 442
        },
        "outputId": "e379ef04-7488-491c-8880-e677943fe724"
      },
      "source": [
        "# ##### IOHAVOC MODIFICATIONS ==>> CREATE THE VOCAB FOR NON-BPE EXPERIMENTS\n",
        "from os import path\n",
        "\n",
        "os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n",
        "os.environ[\"tgt\"] = target_language\n",
        "os.environ[\"data_path\"] = path.join(\"joeynmt\", \"data\", source_language + target_language) # Herman! \n",
        "\n",
        "# Create directory, move everyone we care about to the correct location\n",
        "! mkdir -p $data_path\n",
        "! cp train.* $data_path\n",
        "! cp test.* $data_path\n",
        "! cp dev.* $data_path\n",
        "! ls $data_path\n",
        "\n",
        "# Also move everything we care about to a mounted location in google drive (relevant if running in colab) at gdrive_path\n",
        "! cp train.* \"$gdrive_path\"\n",
        "! cp test.* \"$gdrive_path\"\n",
        "! cp dev.* \"$gdrive_path\"\n",
        "! ls \"$gdrive_path\"\n",
        "\n",
        "! sudo chmod 777 joeynmt/scripts/build_vocab.py\n",
        "! joeynmt/scripts/build_vocab.py joeynmt/data/$src$tgt/train.$src joeynmt/data/$src$tgt/train.$tgt --output_path joeynmt/data/$src$tgt/vocab-nonBPE.txt\n",
        "\n",
        "# Some output\n",
        "! echo \"Urhobo Sentences\"\n",
        "! tail -n 5 test.$tgt\n",
        "! echo \"Combined Vocab\"\n",
        "! tail -n 10 joeynmt/data/$src$tgt/vocab-nonBPE.txt  # Herman"
      ],
      "execution_count": 40,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "bpe.codes.4000\tdev.urh       test.en-any.en\ttrain.bpe.urh\t  vocab.txt\n",
            "dev.bpe.en\ttest.bpe.en   test.en-any.en.1\ttrain.en\n",
            "dev.bpe.urh\ttest.bpe.urh  test.urh\t\ttrain.urh\n",
            "dev.en\t\ttest.en       train.bpe.en\tvocab-nonBPE.txt\n",
            "bpe.codes.4000\tdev.urh       test.en\t\ttrain.bpe.en   vocab-nonBPE.txt\n",
            "dev.bpe.en\tmodels\t      test.en-any.en\ttrain.bpe.urh\n",
            "dev.bpe.urh\ttest.bpe.en   test.en-any.en.1\ttrain.en\n",
            "dev.en\t\ttest.bpe.urh  test.urh\t\ttrain.urh\n",
            "Urhobo Sentences\n",
            "Diesorọ Hushai vwọ guọnọ uduefiogbere ọ sa vwọ fuevun kẹ Ọghẹnẹ ?\n",
            "Diesorọ ọ vwọ guọnọ uduefiogbere avwanre ke sa fuevun ?\n",
            "Me nẹrhovwo vwọ kẹ uduefiogbere me sa vwọ yọnregan .\n",
            "Enẹna , ẹwẹn rayen kpotọ re , me sa kọn bru ayen ra ọkieje . ” — Se Isẹ 29 : 25 .\n",
            "[ 1 ] ( ẹkorota 7 ) E wene edẹ evo .\n",
            "Combined Vocab\n",
            "devilish\n",
            "mutidia\n",
            "intrusions\n",
            "Motivated\n",
            "slope\n",
            "subtracted\n",
            "concentrations\n",
            "patches\n",
            "blooms\n",
            "ọviẹ\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "qdZ_lamIBZva",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "!cp joeynmt/data/$src$tgt/vocab-nonBPE.txt \"$gdrive_path\""
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "IlMitUHR8Qy-",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 85
        },
        "outputId": "9d926509-f30d-4ccf-98a0-b4eb340974a3"
      },
      "source": [
        "# Also move everything we care about to a mounted location in google drive (relevant if running in colab) at gdrive_path\n",
        "! cp train.* \"$gdrive_path\"\n",
        "! cp test.* \"$gdrive_path\"\n",
        "! cp dev.* \"$gdrive_path\"\n",
        "! cp bpe.codes.4000 \"$gdrive_path\"\n",
        "! ls \"$gdrive_path\""
      ],
      "execution_count": 42,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "bpe.codes.4000\tdev.urh       test.en\t\ttrain.bpe.en   vocab-nonBPE.txt\n",
            "dev.bpe.en\tmodels\t      test.en-any.en\ttrain.bpe.urh\n",
            "dev.bpe.urh\ttest.bpe.en   test.en-any.en.1\ttrain.en\n",
            "dev.en\t\ttest.bpe.urh  test.urh\t\ttrain.urh\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "Ixmzi60WsUZ8"
      },
      "source": [
        "# Creating the JoeyNMT Config\n",
        "\n",
        "JoeyNMT requires a yaml config. We provide a template below. We've also set a number of defaults with it, that you may play with!\n",
        "\n",
        "- We used Transformer architecture \n",
        "- We set our dropout to reasonably high: 0.3 (recommended in  [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021))\n",
        "\n",
        "Things worth playing with:\n",
        "- The batch size (also recommended to change for low-resourced languages)\n",
        "- The number of epochs (we've set it at 30 just so it runs in about an hour, for testing purposes)\n",
        "- The decoder options (beam_size, alpha)\n",
        "- Evaluation metrics (BLEU versus Crhf4)"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "PIs1lY2hxMsl",
        "colab": {}
      },
      "source": [
        "# This creates the config file for our JoeyNMT system. It might seem overwhelming so we've provided a couple of useful parameters you'll need to update\n",
        "# (You can of course play with all the parameters if you'd like!)\n",
        "\n",
        "name = '%s%s' % (source_language, target_language)\n",
        "gdrive_path = os.environ[\"gdrive_path\"]\n",
        "\n",
        "# Create the config\n",
        "config = \"\"\"\n",
        "name: \"{name}_transformer\"\n",
        "\n",
        "data:\n",
        "    src: \"{source_language}\"\n",
        "    trg: \"{target_language}\"\n",
        "    train: \"data/{name}/train\"\n",
        "    dev:   \"data/{name}/dev\"\n",
        "    test:  \"data/{name}/test\"\n",
        "    level: \"word\"\n",
        "    lowercase: False\n",
        "    max_sent_length: 100\n",
        "    src_vocab: \"data/{name}/vocab-nonBPE.txt\"\n",
        "    trg_vocab: \"data/{name}/vocab-nonBPE.txt\"\n",
        "\n",
        "testing:\n",
        "    beam_size: 5\n",
        "    alpha: 1.0\n",
        "\n",
        "training:\n",
        "    #load_model: \"{gdrive_path}/models/{name}_transformer/1.ckpt\" # if uncommented, load a pre-trained model from this checkpoint\n",
        "    random_seed: 42\n",
        "    optimizer: \"adam\"\n",
        "    normalization: \"tokens\"\n",
        "    adam_betas: [0.9, 0.999] \n",
        "    scheduling: \"plateau\"           # TODO: try switching from plateau to Noam scheduling\n",
        "    patience: 5                     # For plateau: decrease learning rate by decrease_factor if validation score has not improved for this many validation rounds.\n",
        "    learning_rate_factor: 0.5       # factor for Noam scheduler (used with Transformer)\n",
        "    learning_rate_warmup: 1000      # warmup steps for Noam scheduler (used with Transformer)\n",
        "    decrease_factor: 0.7\n",
        "    loss: \"crossentropy\"\n",
        "    learning_rate: 0.0003\n",
        "    learning_rate_min: 0.00000001\n",
        "    weight_decay: 0.0\n",
        "    label_smoothing: 0.1\n",
        "    batch_size: 4096\n",
        "    batch_type: \"token\"\n",
        "    eval_batch_size: 3600\n",
        "    eval_batch_type: \"token\"\n",
        "    batch_multiplier: 1\n",
        "    early_stopping_metric: \"ppl\"\n",
        "    epochs: 150                     # TODO: Decrease for when playing around and checking of working. Around 30 is sufficient to check if its working at all\n",
        "    validation_freq: 1000           # TODO: Set to at least once per epoch.\n",
        "    logging_freq: 100\n",
        "    eval_metric: \"bleu\"\n",
        "    model_dir: \"models/{name}_transformer\"\n",
        "    overwrite: True                # TODO: Set to True if you want to overwrite possibly existing models. \n",
        "    shuffle: True\n",
        "    use_cuda: True\n",
        "    max_output_length: 100\n",
        "    print_valid_sents: [0, 1, 2, 3]\n",
        "    keep_last_ckpts: 3\n",
        "\n",
        "model:\n",
        "    initializer: \"xavier\"\n",
        "    bias_initializer: \"zeros\"\n",
        "    init_gain: 1.0\n",
        "    embed_initializer: \"xavier\"\n",
        "    embed_init_gain: 1.0\n",
        "    tied_embeddings: True\n",
        "    tied_softmax: True\n",
        "    encoder:\n",
        "        type: \"transformer\"\n",
        "        num_layers: 6\n",
        "        num_heads: 4             # TODO: Increase to 8 for larger data.\n",
        "        embeddings:\n",
        "            embedding_dim: 256   # TODO: Increase to 512 for larger data.\n",
        "            scale: True\n",
        "            dropout: 0.2\n",
        "        # typically ff_size = 4 x hidden_size\n",
        "        hidden_size: 256         # TODO: Increase to 512 for larger data.\n",
        "        ff_size: 1024            # TODO: Increase to 2048 for larger data.\n",
        "        dropout: 0.3\n",
        "    decoder:\n",
        "        type: \"transformer\"\n",
        "        num_layers: 6\n",
        "        num_heads: 4              # TODO: Increase to 8 for larger data.\n",
        "        embeddings:\n",
        "            embedding_dim: 256    # TODO: Increase to 512 for larger data.\n",
        "            scale: True\n",
        "            dropout: 0.2\n",
        "        # typically ff_size = 4 x hidden_size\n",
        "        hidden_size: 256         # TODO: Increase to 512 for larger data.\n",
        "        ff_size: 1024            # TODO: Increase to 2048 for larger data.\n",
        "        dropout: 0.3\n",
        "\"\"\".format(name=name, gdrive_path=os.environ[\"gdrive_path\"], source_language=source_language, target_language=target_language)\n",
        "with open(\"joeynmt/configs/transformer_{name}.yaml\".format(name=name),'w') as f:\n",
        "    f.write(config)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "pIifxE3Qzuvs"
      },
      "source": [
        "# Train the Model\n",
        "\n",
        "This single line of joeynmt runs the training using the config we made above"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "6ZBPFwT94WpI",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        },
        "outputId": "06325004-515e-4e59-c1d5-e7298d667d68"
      },
      "source": [
        "# Train the model\n",
        "# You can press Ctrl-C to stop. And then run the next cell to save your checkpoints! \n",
        "!cd joeynmt; python3 -m joeynmt train configs/transformer_$src$tgt.yaml"
      ],
      "execution_count": 44,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "2019-12-31 06:52:11,457 Hello! This is Joey-NMT.\n",
            "2019-12-31 06:52:12,772 Total params: 16802560\n",
            "2019-12-31 06:52:12,773 Trainable parameters: ['decoder.layer_norm.bias', 'decoder.layer_norm.weight', 'decoder.layers.0.dec_layer_norm.bias', 'decoder.layers.0.dec_layer_norm.weight', 'decoder.layers.0.feed_forward.layer_norm.bias', 'decoder.layers.0.feed_forward.layer_norm.weight', 'decoder.layers.0.feed_forward.pwff_layer.0.bias', 'decoder.layers.0.feed_forward.pwff_layer.0.weight', 'decoder.layers.0.feed_forward.pwff_layer.3.bias', 'decoder.layers.0.feed_forward.pwff_layer.3.weight', 'decoder.layers.0.src_trg_att.k_layer.bias', 'decoder.layers.0.src_trg_att.k_layer.weight', 'decoder.layers.0.src_trg_att.output_layer.bias', 'decoder.layers.0.src_trg_att.output_layer.weight', 'decoder.layers.0.src_trg_att.q_layer.bias', 'decoder.layers.0.src_trg_att.q_layer.weight', 'decoder.layers.0.src_trg_att.v_layer.bias', 'decoder.layers.0.src_trg_att.v_layer.weight', 'decoder.layers.0.trg_trg_att.k_layer.bias', 'decoder.layers.0.trg_trg_att.k_layer.weight', 'decoder.layers.0.trg_trg_att.output_layer.bias', 'decoder.layers.0.trg_trg_att.output_layer.weight', 'decoder.layers.0.trg_trg_att.q_layer.bias', 'decoder.layers.0.trg_trg_att.q_layer.weight', 'decoder.layers.0.trg_trg_att.v_layer.bias', 'decoder.layers.0.trg_trg_att.v_layer.weight', 'decoder.layers.0.x_layer_norm.bias', 'decoder.layers.0.x_layer_norm.weight', 'decoder.layers.1.dec_layer_norm.bias', 'decoder.layers.1.dec_layer_norm.weight', 'decoder.layers.1.feed_forward.layer_norm.bias', 'decoder.layers.1.feed_forward.layer_norm.weight', 'decoder.layers.1.feed_forward.pwff_layer.0.bias', 'decoder.layers.1.feed_forward.pwff_layer.0.weight', 'decoder.layers.1.feed_forward.pwff_layer.3.bias', 'decoder.layers.1.feed_forward.pwff_layer.3.weight', 'decoder.layers.1.src_trg_att.k_layer.bias', 'decoder.layers.1.src_trg_att.k_layer.weight', 'decoder.layers.1.src_trg_att.output_layer.bias', 'decoder.layers.1.src_trg_att.output_layer.weight', 'decoder.layers.1.src_trg_att.q_layer.bias', 'decoder.layers.1.src_trg_att.q_layer.weight', 'decoder.layers.1.src_trg_att.v_layer.bias', 'decoder.layers.1.src_trg_att.v_layer.weight', 'decoder.layers.1.trg_trg_att.k_layer.bias', 'decoder.layers.1.trg_trg_att.k_layer.weight', 'decoder.layers.1.trg_trg_att.output_layer.bias', 'decoder.layers.1.trg_trg_att.output_layer.weight', 'decoder.layers.1.trg_trg_att.q_layer.bias', 'decoder.layers.1.trg_trg_att.q_layer.weight', 'decoder.layers.1.trg_trg_att.v_layer.bias', 'decoder.layers.1.trg_trg_att.v_layer.weight', 'decoder.layers.1.x_layer_norm.bias', 'decoder.layers.1.x_layer_norm.weight', 'decoder.layers.2.dec_layer_norm.bias', 'decoder.layers.2.dec_layer_norm.weight', 'decoder.layers.2.feed_forward.layer_norm.bias', 'decoder.layers.2.feed_forward.layer_norm.weight', 'decoder.layers.2.feed_forward.pwff_layer.0.bias', 'decoder.layers.2.feed_forward.pwff_layer.0.weight', 'decoder.layers.2.feed_forward.pwff_layer.3.bias', 'decoder.layers.2.feed_forward.pwff_layer.3.weight', 'decoder.layers.2.src_trg_att.k_layer.bias', 'decoder.layers.2.src_trg_att.k_layer.weight', 'decoder.layers.2.src_trg_att.output_layer.bias', 'decoder.layers.2.src_trg_att.output_layer.weight', 'decoder.layers.2.src_trg_att.q_layer.bias', 'decoder.layers.2.src_trg_att.q_layer.weight', 'decoder.layers.2.src_trg_att.v_layer.bias', 'decoder.layers.2.src_trg_att.v_layer.weight', 'decoder.layers.2.trg_trg_att.k_layer.bias', 'decoder.layers.2.trg_trg_att.k_layer.weight', 'decoder.layers.2.trg_trg_att.output_layer.bias', 'decoder.layers.2.trg_trg_att.output_layer.weight', 
'decoder.layers.2.trg_trg_att.q_layer.bias', 'decoder.layers.2.trg_trg_att.q_layer.weight', 'decoder.layers.2.trg_trg_att.v_layer.bias', 'decoder.layers.2.trg_trg_att.v_layer.weight', 'decoder.layers.2.x_layer_norm.bias', 'decoder.layers.2.x_layer_norm.weight', 'decoder.layers.3.dec_layer_norm.bias', 'decoder.layers.3.dec_layer_norm.weight', 'decoder.layers.3.feed_forward.layer_norm.bias', 'decoder.layers.3.feed_forward.layer_norm.weight', 'decoder.layers.3.feed_forward.pwff_layer.0.bias', 'decoder.layers.3.feed_forward.pwff_layer.0.weight', 'decoder.layers.3.feed_forward.pwff_layer.3.bias', 'decoder.layers.3.feed_forward.pwff_layer.3.weight', 'decoder.layers.3.src_trg_att.k_layer.bias', 'decoder.layers.3.src_trg_att.k_layer.weight', 'decoder.layers.3.src_trg_att.output_layer.bias', 'decoder.layers.3.src_trg_att.output_layer.weight', 'decoder.layers.3.src_trg_att.q_layer.bias', 'decoder.layers.3.src_trg_att.q_layer.weight', 'decoder.layers.3.src_trg_att.v_layer.bias', 'decoder.layers.3.src_trg_att.v_layer.weight', 'decoder.layers.3.trg_trg_att.k_layer.bias', 'decoder.layers.3.trg_trg_att.k_layer.weight', 'decoder.layers.3.trg_trg_att.output_layer.bias', 'decoder.layers.3.trg_trg_att.output_layer.weight', 'decoder.layers.3.trg_trg_att.q_layer.bias', 'decoder.layers.3.trg_trg_att.q_layer.weight', 'decoder.layers.3.trg_trg_att.v_layer.bias', 'decoder.layers.3.trg_trg_att.v_layer.weight', 'decoder.layers.3.x_layer_norm.bias', 'decoder.layers.3.x_layer_norm.weight', 'decoder.layers.4.dec_layer_norm.bias', 'decoder.layers.4.dec_layer_norm.weight', 'decoder.layers.4.feed_forward.layer_norm.bias', 'decoder.layers.4.feed_forward.layer_norm.weight', 'decoder.layers.4.feed_forward.pwff_layer.0.bias', 'decoder.layers.4.feed_forward.pwff_layer.0.weight', 'decoder.layers.4.feed_forward.pwff_layer.3.bias', 'decoder.layers.4.feed_forward.pwff_layer.3.weight', 'decoder.layers.4.src_trg_att.k_layer.bias', 'decoder.layers.4.src_trg_att.k_layer.weight', 'decoder.layers.4.src_trg_att.output_layer.bias', 'decoder.layers.4.src_trg_att.output_layer.weight', 'decoder.layers.4.src_trg_att.q_layer.bias', 'decoder.layers.4.src_trg_att.q_layer.weight', 'decoder.layers.4.src_trg_att.v_layer.bias', 'decoder.layers.4.src_trg_att.v_layer.weight', 'decoder.layers.4.trg_trg_att.k_layer.bias', 'decoder.layers.4.trg_trg_att.k_layer.weight', 'decoder.layers.4.trg_trg_att.output_layer.bias', 'decoder.layers.4.trg_trg_att.output_layer.weight', 'decoder.layers.4.trg_trg_att.q_layer.bias', 'decoder.layers.4.trg_trg_att.q_layer.weight', 'decoder.layers.4.trg_trg_att.v_layer.bias', 'decoder.layers.4.trg_trg_att.v_layer.weight', 'decoder.layers.4.x_layer_norm.bias', 'decoder.layers.4.x_layer_norm.weight', 'decoder.layers.5.dec_layer_norm.bias', 'decoder.layers.5.dec_layer_norm.weight', 'decoder.layers.5.feed_forward.layer_norm.bias', 'decoder.layers.5.feed_forward.layer_norm.weight', 'decoder.layers.5.feed_forward.pwff_layer.0.bias', 'decoder.layers.5.feed_forward.pwff_layer.0.weight', 'decoder.layers.5.feed_forward.pwff_layer.3.bias', 'decoder.layers.5.feed_forward.pwff_layer.3.weight', 'decoder.layers.5.src_trg_att.k_layer.bias', 'decoder.layers.5.src_trg_att.k_layer.weight', 'decoder.layers.5.src_trg_att.output_layer.bias', 'decoder.layers.5.src_trg_att.output_layer.weight', 'decoder.layers.5.src_trg_att.q_layer.bias', 'decoder.layers.5.src_trg_att.q_layer.weight', 'decoder.layers.5.src_trg_att.v_layer.bias', 'decoder.layers.5.src_trg_att.v_layer.weight', 'decoder.layers.5.trg_trg_att.k_layer.bias', 
'decoder.layers.5.trg_trg_att.k_layer.weight', 'decoder.layers.5.trg_trg_att.output_layer.bias', 'decoder.layers.5.trg_trg_att.output_layer.weight', 'decoder.layers.5.trg_trg_att.q_layer.bias', 'decoder.layers.5.trg_trg_att.q_layer.weight', 'decoder.layers.5.trg_trg_att.v_layer.bias', 'decoder.layers.5.trg_trg_att.v_layer.weight', 'decoder.layers.5.x_layer_norm.bias', 'decoder.layers.5.x_layer_norm.weight', 'encoder.layer_norm.bias', 'encoder.layer_norm.weight', 'encoder.layers.0.feed_forward.layer_norm.bias', 'encoder.layers.0.feed_forward.layer_norm.weight', 'encoder.layers.0.feed_forward.pwff_layer.0.bias', 'encoder.layers.0.feed_forward.pwff_layer.0.weight', 'encoder.layers.0.feed_forward.pwff_layer.3.bias', 'encoder.layers.0.feed_forward.pwff_layer.3.weight', 'encoder.layers.0.layer_norm.bias', 'encoder.layers.0.layer_norm.weight', 'encoder.layers.0.src_src_att.k_layer.bias', 'encoder.layers.0.src_src_att.k_layer.weight', 'encoder.layers.0.src_src_att.output_layer.bias', 'encoder.layers.0.src_src_att.output_layer.weight', 'encoder.layers.0.src_src_att.q_layer.bias', 'encoder.layers.0.src_src_att.q_layer.weight', 'encoder.layers.0.src_src_att.v_layer.bias', 'encoder.layers.0.src_src_att.v_layer.weight', 'encoder.layers.1.feed_forward.layer_norm.bias', 'encoder.layers.1.feed_forward.layer_norm.weight', 'encoder.layers.1.feed_forward.pwff_layer.0.bias', 'encoder.layers.1.feed_forward.pwff_layer.0.weight', 'encoder.layers.1.feed_forward.pwff_layer.3.bias', 'encoder.layers.1.feed_forward.pwff_layer.3.weight', 'encoder.layers.1.layer_norm.bias', 'encoder.layers.1.layer_norm.weight', 'encoder.layers.1.src_src_att.k_layer.bias', 'encoder.layers.1.src_src_att.k_layer.weight', 'encoder.layers.1.src_src_att.output_layer.bias', 'encoder.layers.1.src_src_att.output_layer.weight', 'encoder.layers.1.src_src_att.q_layer.bias', 'encoder.layers.1.src_src_att.q_layer.weight', 'encoder.layers.1.src_src_att.v_layer.bias', 'encoder.layers.1.src_src_att.v_layer.weight', 'encoder.layers.2.feed_forward.layer_norm.bias', 'encoder.layers.2.feed_forward.layer_norm.weight', 'encoder.layers.2.feed_forward.pwff_layer.0.bias', 'encoder.layers.2.feed_forward.pwff_layer.0.weight', 'encoder.layers.2.feed_forward.pwff_layer.3.bias', 'encoder.layers.2.feed_forward.pwff_layer.3.weight', 'encoder.layers.2.layer_norm.bias', 'encoder.layers.2.layer_norm.weight', 'encoder.layers.2.src_src_att.k_layer.bias', 'encoder.layers.2.src_src_att.k_layer.weight', 'encoder.layers.2.src_src_att.output_layer.bias', 'encoder.layers.2.src_src_att.output_layer.weight', 'encoder.layers.2.src_src_att.q_layer.bias', 'encoder.layers.2.src_src_att.q_layer.weight', 'encoder.layers.2.src_src_att.v_layer.bias', 'encoder.layers.2.src_src_att.v_layer.weight', 'encoder.layers.3.feed_forward.layer_norm.bias', 'encoder.layers.3.feed_forward.layer_norm.weight', 'encoder.layers.3.feed_forward.pwff_layer.0.bias', 'encoder.layers.3.feed_forward.pwff_layer.0.weight', 'encoder.layers.3.feed_forward.pwff_layer.3.bias', 'encoder.layers.3.feed_forward.pwff_layer.3.weight', 'encoder.layers.3.layer_norm.bias', 'encoder.layers.3.layer_norm.weight', 'encoder.layers.3.src_src_att.k_layer.bias', 'encoder.layers.3.src_src_att.k_layer.weight', 'encoder.layers.3.src_src_att.output_layer.bias', 'encoder.layers.3.src_src_att.output_layer.weight', 'encoder.layers.3.src_src_att.q_layer.bias', 'encoder.layers.3.src_src_att.q_layer.weight', 'encoder.layers.3.src_src_att.v_layer.bias', 'encoder.layers.3.src_src_att.v_layer.weight', 
'encoder.layers.4.feed_forward.layer_norm.bias', 'encoder.layers.4.feed_forward.layer_norm.weight', 'encoder.layers.4.feed_forward.pwff_layer.0.bias', 'encoder.layers.4.feed_forward.pwff_layer.0.weight', 'encoder.layers.4.feed_forward.pwff_layer.3.bias', 'encoder.layers.4.feed_forward.pwff_layer.3.weight', 'encoder.layers.4.layer_norm.bias', 'encoder.layers.4.layer_norm.weight', 'encoder.layers.4.src_src_att.k_layer.bias', 'encoder.layers.4.src_src_att.k_layer.weight', 'encoder.layers.4.src_src_att.output_layer.bias', 'encoder.layers.4.src_src_att.output_layer.weight', 'encoder.layers.4.src_src_att.q_layer.bias', 'encoder.layers.4.src_src_att.q_layer.weight', 'encoder.layers.4.src_src_att.v_layer.bias', 'encoder.layers.4.src_src_att.v_layer.weight', 'encoder.layers.5.feed_forward.layer_norm.bias', 'encoder.layers.5.feed_forward.layer_norm.weight', 'encoder.layers.5.feed_forward.pwff_layer.0.bias', 'encoder.layers.5.feed_forward.pwff_layer.0.weight', 'encoder.layers.5.feed_forward.pwff_layer.3.bias', 'encoder.layers.5.feed_forward.pwff_layer.3.weight', 'encoder.layers.5.layer_norm.bias', 'encoder.layers.5.layer_norm.weight', 'encoder.layers.5.src_src_att.k_layer.bias', 'encoder.layers.5.src_src_att.k_layer.weight', 'encoder.layers.5.src_src_att.output_layer.bias', 'encoder.layers.5.src_src_att.output_layer.weight', 'encoder.layers.5.src_src_att.q_layer.bias', 'encoder.layers.5.src_src_att.q_layer.weight', 'encoder.layers.5.src_src_att.v_layer.bias', 'encoder.layers.5.src_src_att.v_layer.weight', 'src_embed.lut.weight']\n",
            "2019-12-31 06:52:15,793 cfg.name                           : enurh_transformer\n",
            "2019-12-31 06:52:15,793 cfg.data.src                       : en\n",
            "2019-12-31 06:52:15,793 cfg.data.trg                       : urh\n",
            "2019-12-31 06:52:15,793 cfg.data.train                     : data/enurh/train\n",
            "2019-12-31 06:52:15,793 cfg.data.dev                       : data/enurh/dev\n",
            "2019-12-31 06:52:15,793 cfg.data.test                      : data/enurh/test\n",
            "2019-12-31 06:52:15,793 cfg.data.level                     : word\n",
            "2019-12-31 06:52:15,793 cfg.data.lowercase                 : False\n",
            "2019-12-31 06:52:15,793 cfg.data.max_sent_length           : 100\n",
            "2019-12-31 06:52:15,793 cfg.data.src_vocab                 : data/enurh/vocab-nonBPE.txt\n",
            "2019-12-31 06:52:15,793 cfg.data.trg_vocab                 : data/enurh/vocab-nonBPE.txt\n",
            "2019-12-31 06:52:15,793 cfg.testing.beam_size              : 5\n",
            "2019-12-31 06:52:15,793 cfg.testing.alpha                  : 1.0\n",
            "2019-12-31 06:52:15,793 cfg.training.random_seed           : 42\n",
            "2019-12-31 06:52:15,793 cfg.training.optimizer             : adam\n",
            "2019-12-31 06:52:15,793 cfg.training.normalization         : tokens\n",
            "2019-12-31 06:52:15,793 cfg.training.adam_betas            : [0.9, 0.999]\n",
            "2019-12-31 06:52:15,794 cfg.training.scheduling            : plateau\n",
            "2019-12-31 06:52:15,794 cfg.training.patience              : 5\n",
            "2019-12-31 06:52:15,794 cfg.training.learning_rate_factor  : 0.5\n",
            "2019-12-31 06:52:15,794 cfg.training.learning_rate_warmup  : 1000\n",
            "2019-12-31 06:52:15,794 cfg.training.decrease_factor       : 0.7\n",
            "2019-12-31 06:52:15,794 cfg.training.loss                  : crossentropy\n",
            "2019-12-31 06:52:15,794 cfg.training.learning_rate         : 0.0003\n",
            "2019-12-31 06:52:15,794 cfg.training.learning_rate_min     : 1e-08\n",
            "2019-12-31 06:52:15,794 cfg.training.weight_decay          : 0.0\n",
            "2019-12-31 06:52:15,794 cfg.training.label_smoothing       : 0.1\n",
            "2019-12-31 06:52:15,794 cfg.training.batch_size            : 4096\n",
            "2019-12-31 06:52:15,794 cfg.training.batch_type            : token\n",
            "2019-12-31 06:52:15,794 cfg.training.eval_batch_size       : 3600\n",
            "2019-12-31 06:52:15,794 cfg.training.eval_batch_type       : token\n",
            "2019-12-31 06:52:15,794 cfg.training.batch_multiplier      : 1\n",
            "2019-12-31 06:52:15,794 cfg.training.early_stopping_metric : ppl\n",
            "2019-12-31 06:52:15,794 cfg.training.epochs                : 150\n",
            "2019-12-31 06:52:15,794 cfg.training.validation_freq       : 1000\n",
            "2019-12-31 06:52:15,794 cfg.training.logging_freq          : 100\n",
            "2019-12-31 06:52:15,794 cfg.training.eval_metric           : bleu\n",
            "2019-12-31 06:52:15,794 cfg.training.model_dir             : models/enurh_transformer\n",
            "2019-12-31 06:52:15,794 cfg.training.overwrite             : True\n",
            "2019-12-31 06:52:15,794 cfg.training.shuffle               : True\n",
            "2019-12-31 06:52:15,794 cfg.training.use_cuda              : True\n",
            "2019-12-31 06:52:15,795 cfg.training.max_output_length     : 100\n",
            "2019-12-31 06:52:15,795 cfg.training.print_valid_sents     : [0, 1, 2, 3]\n",
            "2019-12-31 06:52:15,795 cfg.training.keep_last_ckpts       : 3\n",
            "2019-12-31 06:52:15,795 cfg.model.initializer              : xavier\n",
            "2019-12-31 06:52:15,795 cfg.model.bias_initializer         : zeros\n",
            "2019-12-31 06:52:15,795 cfg.model.init_gain                : 1.0\n",
            "2019-12-31 06:52:15,795 cfg.model.embed_initializer        : xavier\n",
            "2019-12-31 06:52:15,795 cfg.model.embed_init_gain          : 1.0\n",
            "2019-12-31 06:52:15,795 cfg.model.tied_embeddings          : True\n",
            "2019-12-31 06:52:15,795 cfg.model.tied_softmax             : True\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.type             : transformer\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.num_layers       : 6\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.num_heads        : 4\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.embeddings.embedding_dim : 256\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.embeddings.scale : True\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.embeddings.dropout : 0.2\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.hidden_size      : 256\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.ff_size          : 1024\n",
            "2019-12-31 06:52:15,795 cfg.model.encoder.dropout          : 0.3\n",
            "2019-12-31 06:52:15,795 cfg.model.decoder.type             : transformer\n",
            "2019-12-31 06:52:15,795 cfg.model.decoder.num_layers       : 6\n",
            "2019-12-31 06:52:15,795 cfg.model.decoder.num_heads        : 4\n",
            "2019-12-31 06:52:15,795 cfg.model.decoder.embeddings.embedding_dim : 256\n",
            "2019-12-31 06:52:15,796 cfg.model.decoder.embeddings.scale : True\n",
            "2019-12-31 06:52:15,796 cfg.model.decoder.embeddings.dropout : 0.2\n",
            "2019-12-31 06:52:15,796 cfg.model.decoder.hidden_size      : 256\n",
            "2019-12-31 06:52:15,796 cfg.model.decoder.ff_size          : 1024\n",
            "2019-12-31 06:52:15,796 cfg.model.decoder.dropout          : 0.3\n",
            "2019-12-31 06:52:15,796 Data set sizes: \n",
            "\ttrain 25608,\n",
            "\tvalid 1000,\n",
            "\ttest 2652\n",
            "2019-12-31 06:52:15,796 First training example:\n",
            "\t[SRC] The number of publishers is now about ten times what it was when I began serving here .\n",
            "\t[TRG] Ighwoghwota rehẹ ẹkuotọ na enẹna vwẹ ọhwọhwọ ihwe vwo bun vrẹ obo rọ hepha ọke me vwọ ga vwẹ oboyin .\n",
            "2019-12-31 06:52:15,796 First 10 words (src): (0) <unk> (1) <pad> (2) <s> (3) </s> (4) . (5) , (6) rẹ (7) the (8) to (9) na\n",
            "2019-12-31 06:52:15,796 First 10 words (trg): (0) <unk> (1) <pad> (2) <s> (3) </s> (4) . (5) , (6) rẹ (7) the (8) to (9) na\n",
            "2019-12-31 06:52:15,796 Number of Src words (types): 22431\n",
            "2019-12-31 06:52:15,797 Number of Trg words (types): 22431\n",
            "2019-12-31 06:52:15,797 Model(\n",
            "\tencoder=TransformerEncoder(num_layers=6, num_heads=4),\n",
            "\tdecoder=TransformerDecoder(num_layers=6, num_heads=4),\n",
            "\tsrc_embed=Embeddings(embedding_dim=256, vocab_size=22431),\n",
            "\ttrg_embed=Embeddings(embedding_dim=256, vocab_size=22431))\n",
            "2019-12-31 06:52:15,812 EPOCH 1\n",
            "2019-12-31 06:52:30,374 Epoch   1 Step:      100 Batch Loss:     5.427601 Tokens per Sec:    13843, Lr: 0.000300\n",
            "2019-12-31 06:52:45,093 Epoch   1 Step:      200 Batch Loss:     5.082472 Tokens per Sec:    13882, Lr: 0.000300\n",
            "2019-12-31 06:52:55,258 Epoch   1: total training loss 1463.09\n",
            "2019-12-31 06:52:55,258 EPOCH 2\n",
            "2019-12-31 06:52:59,757 Epoch   2 Step:      300 Batch Loss:     4.783172 Tokens per Sec:    13252, Lr: 0.000300\n",
            "2019-12-31 06:53:14,400 Epoch   2 Step:      400 Batch Loss:     4.720972 Tokens per Sec:    13727, Lr: 0.000300\n",
            "2019-12-31 06:53:29,098 Epoch   2 Step:      500 Batch Loss:     4.665186 Tokens per Sec:    13964, Lr: 0.000300\n",
            "2019-12-31 06:53:34,775 Epoch   2: total training loss 1238.50\n",
            "2019-12-31 06:53:34,775 EPOCH 3\n",
            "2019-12-31 06:53:43,729 Epoch   3 Step:      600 Batch Loss:     4.156670 Tokens per Sec:    13992, Lr: 0.000300\n",
            "2019-12-31 06:53:58,306 Epoch   3 Step:      700 Batch Loss:     4.177386 Tokens per Sec:    13452, Lr: 0.000300\n",
            "2019-12-31 06:54:12,836 Epoch   3 Step:      800 Batch Loss:     3.818682 Tokens per Sec:    14428, Lr: 0.000300\n",
            "2019-12-31 06:54:14,015 Epoch   3: total training loss 1105.01\n",
            "2019-12-31 06:54:14,015 EPOCH 4\n",
            "2019-12-31 06:54:27,480 Epoch   4 Step:      900 Batch Loss:     3.945844 Tokens per Sec:    13672, Lr: 0.000300\n",
            "2019-12-31 06:54:42,134 Epoch   4 Step:     1000 Batch Loss:     3.845206 Tokens per Sec:    14279, Lr: 0.000300\n",
            "2019-12-31 06:55:15,730 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 06:55:15,730 Saving new checkpoint.\n",
            "2019-12-31 06:55:16,062 Example #0\n",
            "2019-12-31 06:55:16,063 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 06:55:16,063 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 06:55:16,063 \tHypothesis: ( 1 Jọn 1 : 1 ) Ọ sa dianẹ avwanre vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo ẹguọnọ rẹ avwanre vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo ẹguọnọ rẹ Jihova .\n",
            "2019-12-31 06:55:16,063 Example #1\n",
            "2019-12-31 06:55:16,063 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 06:55:16,063 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 06:55:16,063 \tHypothesis: Ọ da ta : “ Ọ da ta : “ O vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo ẹguọnọ rẹ Ọghẹnẹ .\n",
            "2019-12-31 06:55:16,063 Example #2\n",
            "2019-12-31 06:55:16,063 \tSource:     But freedom from what ?\n",
            "2019-12-31 06:55:16,063 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 06:55:16,063 \tHypothesis: Die yen avwanre vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo ?\n",
            "2019-12-31 06:55:16,063 Example #3\n",
            "2019-12-31 06:55:16,063 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 06:55:16,063 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 06:55:16,063 \tHypothesis: Ọ da dianẹ avwanre vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo vwo ẹguọnọ rẹ Jihova .\n",
            "2019-12-31 06:55:16,063 Validation result (greedy) at epoch   4, step     1000: bleu:   0.44, loss: 78227.9688, ppl:  42.1249, duration: 33.9293s\n",
            "2019-12-31 06:55:27,124 Epoch   4: total training loss 1015.77\n",
            "2019-12-31 06:55:27,124 EPOCH 5\n",
            "2019-12-31 06:55:30,772 Epoch   5 Step:     1100 Batch Loss:     3.604009 Tokens per Sec:    12736, Lr: 0.000300\n",
            "2019-12-31 06:55:45,407 Epoch   5 Step:     1200 Batch Loss:     3.513468 Tokens per Sec:    13516, Lr: 0.000300\n",
            "2019-12-31 06:56:00,296 Epoch   5 Step:     1300 Batch Loss:     3.601948 Tokens per Sec:    14033, Lr: 0.000300\n",
            "2019-12-31 06:56:07,024 Epoch   5: total training loss 969.24\n",
            "2019-12-31 06:56:07,025 EPOCH 6\n",
            "2019-12-31 06:56:15,062 Epoch   6 Step:     1400 Batch Loss:     3.351712 Tokens per Sec:    13493, Lr: 0.000300\n",
            "2019-12-31 06:56:29,594 Epoch   6 Step:     1500 Batch Loss:     3.376288 Tokens per Sec:    13770, Lr: 0.000300\n",
            "2019-12-31 06:56:44,362 Epoch   6 Step:     1600 Batch Loss:     3.233148 Tokens per Sec:    13988, Lr: 0.000300\n",
            "2019-12-31 06:56:46,545 Epoch   6: total training loss 921.14\n",
            "2019-12-31 06:56:46,545 EPOCH 7\n",
            "2019-12-31 06:56:59,105 Epoch   7 Step:     1700 Batch Loss:     3.105795 Tokens per Sec:    13838, Lr: 0.000300\n",
            "2019-12-31 06:57:13,752 Epoch   7 Step:     1800 Batch Loss:     3.416011 Tokens per Sec:    13842, Lr: 0.000300\n",
            "2019-12-31 06:57:26,103 Epoch   7: total training loss 884.22\n",
            "2019-12-31 06:57:26,103 EPOCH 8\n",
            "2019-12-31 06:57:28,276 Epoch   8 Step:     1900 Batch Loss:     3.263082 Tokens per Sec:    12921, Lr: 0.000300\n",
            "2019-12-31 06:57:42,904 Epoch   8 Step:     2000 Batch Loss:     3.043142 Tokens per Sec:    14111, Lr: 0.000300\n",
            "2019-12-31 06:58:16,547 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 06:58:16,547 Saving new checkpoint.\n",
            "2019-12-31 06:58:16,873 Example #0\n",
            "2019-12-31 06:58:16,873 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 06:58:16,873 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 06:58:16,873 \tHypothesis: Ọ da dianẹ a sa mrẹ ukẹcha rẹ ihwo efa , ọ je sa nẹrhẹ a mrẹ ukẹcha rẹ avwanre .\n",
            "2019-12-31 06:58:16,873 Example #1\n",
            "2019-12-31 06:58:16,873 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 06:58:16,874 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 06:58:16,874 \tHypothesis: Ọ da dianẹ ọ dia ọkobaro vwẹ ukpe rẹ ẹkuotọ rẹ ẹkuotọ rẹ Izrẹl .\n",
            "2019-12-31 06:58:16,874 Example #2\n",
            "2019-12-31 06:58:16,874 \tSource:     But freedom from what ?\n",
            "2019-12-31 06:58:16,874 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 06:58:16,874 \tHypothesis: Ẹkẹvuọvo , die yen avwanre vwo ?\n",
            "2019-12-31 06:58:16,874 Example #3\n",
            "2019-12-31 06:58:16,874 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 06:58:16,874 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 06:58:16,874 \tHypothesis: Ọ da dianẹ wọ sa mrẹ ukẹcha kẹ ihwo efa .\n",
            "2019-12-31 06:58:16,874 Validation result (greedy) at epoch   8, step     2000: bleu:   2.83, loss: 67111.7422, ppl:  24.7566, duration: 33.9699s\n",
            "2019-12-31 06:58:31,599 Epoch   8 Step:     2100 Batch Loss:     2.704267 Tokens per Sec:    13466, Lr: 0.000300\n",
            "2019-12-31 06:58:39,953 Epoch   8: total training loss 853.30\n",
            "2019-12-31 06:58:39,953 EPOCH 9\n",
            "2019-12-31 06:58:46,327 Epoch   9 Step:     2200 Batch Loss:     3.021497 Tokens per Sec:    13264, Lr: 0.000300\n",
            "2019-12-31 06:59:01,190 Epoch   9 Step:     2300 Batch Loss:     3.177639 Tokens per Sec:    13941, Lr: 0.000300\n",
            "2019-12-31 06:59:15,820 Epoch   9 Step:     2400 Batch Loss:     2.865735 Tokens per Sec:    13868, Lr: 0.000300\n",
            "2019-12-31 06:59:19,448 Epoch   9: total training loss 805.14\n",
            "2019-12-31 06:59:19,448 EPOCH 10\n",
            "2019-12-31 06:59:30,557 Epoch  10 Step:     2500 Batch Loss:     2.494851 Tokens per Sec:    13864, Lr: 0.000300\n",
            "2019-12-31 06:59:45,149 Epoch  10 Step:     2600 Batch Loss:     3.033088 Tokens per Sec:    13772, Lr: 0.000300\n",
            "2019-12-31 06:59:59,161 Epoch  10: total training loss 779.46\n",
            "2019-12-31 06:59:59,161 EPOCH 11\n",
            "2019-12-31 06:59:59,935 Epoch  11 Step:     2700 Batch Loss:     3.020592 Tokens per Sec:    12892, Lr: 0.000300\n",
            "2019-12-31 07:00:14,664 Epoch  11 Step:     2800 Batch Loss:     2.511718 Tokens per Sec:    13639, Lr: 0.000300\n",
            "2019-12-31 07:00:29,442 Epoch  11 Step:     2900 Batch Loss:     2.980664 Tokens per Sec:    13900, Lr: 0.000300\n",
            "2019-12-31 07:00:38,842 Epoch  11: total training loss 750.48\n",
            "2019-12-31 07:00:38,843 EPOCH 12\n",
            "2019-12-31 07:00:44,065 Epoch  12 Step:     3000 Batch Loss:     2.592532 Tokens per Sec:    13316, Lr: 0.000300\n",
            "2019-12-31 07:01:17,758 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 07:01:17,758 Saving new checkpoint.\n",
            "2019-12-31 07:01:18,076 Example #0\n",
            "2019-12-31 07:01:18,076 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:01:18,077 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:01:18,077 \tHypothesis: Ihwo buebun vwo imuẹro kpahen obo re sa nẹrhẹ ayen riẹn kpahen obo re sa vwọ chọn ayen uko .\n",
            "2019-12-31 07:01:18,077 Example #1\n",
            "2019-12-31 07:01:18,077 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:01:18,077 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:01:18,077 \tHypothesis: ( 1 Kọr . 3 : 1 - 14 ) Ẹkẹvuọvo , o de ji te omarẹ ẹgbukpe ujorin buebun .\n",
            "2019-12-31 07:01:18,077 Example #2\n",
            "2019-12-31 07:01:18,077 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:01:18,077 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:01:18,077 \tHypothesis: Ẹkẹvuọvo , die yen omaevwokpotọ ?\n",
            "2019-12-31 07:01:18,077 Example #3\n",
            "2019-12-31 07:01:18,077 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:01:18,077 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:01:18,077 \tHypothesis: Ọ da dianẹ wọ dia ọtiọyen , kẹ wẹ omavwerhovwẹn .\n",
            "2019-12-31 07:01:18,077 Validation result (greedy) at epoch  12, step     3000: bleu:   4.77, loss: 61367.6328, ppl:  18.8107, duration: 34.0124s\n",
            "2019-12-31 07:01:32,969 Epoch  12 Step:     3100 Batch Loss:     2.875560 Tokens per Sec:    13999, Lr: 0.000300\n",
            "2019-12-31 07:01:47,587 Epoch  12 Step:     3200 Batch Loss:     2.753931 Tokens per Sec:    13652, Lr: 0.000300\n",
            "2019-12-31 07:01:52,566 Epoch  12: total training loss 728.14\n",
            "2019-12-31 07:01:52,566 EPOCH 13\n",
            "2019-12-31 07:02:02,343 Epoch  13 Step:     3300 Batch Loss:     2.643054 Tokens per Sec:    13837, Lr: 0.000300\n",
            "2019-12-31 07:02:17,160 Epoch  13 Step:     3400 Batch Loss:     2.899310 Tokens per Sec:    13726, Lr: 0.000300\n",
            "2019-12-31 07:02:31,803 Epoch  13 Step:     3500 Batch Loss:     2.638120 Tokens per Sec:    13650, Lr: 0.000300\n",
            "2019-12-31 07:02:32,430 Epoch  13: total training loss 705.72\n",
            "2019-12-31 07:02:32,430 EPOCH 14\n",
            "2019-12-31 07:02:46,304 Epoch  14 Step:     3600 Batch Loss:     2.360273 Tokens per Sec:    13483, Lr: 0.000300\n",
            "2019-12-31 07:03:01,038 Epoch  14 Step:     3700 Batch Loss:     2.435451 Tokens per Sec:    13983, Lr: 0.000300\n",
            "2019-12-31 07:03:12,151 Epoch  14: total training loss 680.83\n",
            "2019-12-31 07:03:12,151 EPOCH 15\n",
            "2019-12-31 07:03:15,898 Epoch  15 Step:     3800 Batch Loss:     2.740217 Tokens per Sec:    13625, Lr: 0.000300\n",
            "2019-12-31 07:03:30,757 Epoch  15 Step:     3900 Batch Loss:     2.628906 Tokens per Sec:    13697, Lr: 0.000300\n",
            "2019-12-31 07:03:45,467 Epoch  15 Step:     4000 Batch Loss:     2.365105 Tokens per Sec:    13743, Lr: 0.000300\n",
            "2019-12-31 07:04:19,148 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 07:04:19,148 Saving new checkpoint.\n",
            "2019-12-31 07:04:19,506 Example #0\n",
            "2019-12-31 07:04:19,506 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:04:19,506 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:04:19,506 \tHypothesis: Ihwo buebun vwo omavwerhovwẹn kpahen obo re sa nẹrhẹ ayen se vwo ẹwẹn obrorhiẹn rẹ avwanre , ji vwo ẹwẹn obrorhiẹn rẹ avwanre .\n",
            "2019-12-31 07:04:19,506 Example #1\n",
            "2019-12-31 07:04:19,506 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:04:19,506 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:04:19,506 \tHypothesis: Nonẹna , ọ dia ọkobaro vwẹ Bẹtẹl .\n",
            "2019-12-31 07:04:19,507 Example #2\n",
            "2019-12-31 07:04:19,507 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:04:19,507 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:04:19,507 \tHypothesis: Ẹkẹvuọvo , die yen egbomọphẹ rẹ avwanre ?\n",
            "2019-12-31 07:04:19,507 Example #3\n",
            "2019-12-31 07:04:19,507 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:04:19,507 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:04:19,507 \tHypothesis: Ọ dia ọtiọyen , ọ dia ọtiọyen wo vwo ukoko wẹn - a .\n",
            "2019-12-31 07:04:19,507 Validation result (greedy) at epoch  15, step     4000: bleu:   6.49, loss: 57640.0391, ppl:  15.7396, duration: 34.0399s\n",
            "2019-12-31 07:04:26,260 Epoch  15: total training loss 664.41\n",
            "2019-12-31 07:04:26,260 EPOCH 16\n",
            "2019-12-31 07:04:34,254 Epoch  16 Step:     4100 Batch Loss:     2.644593 Tokens per Sec:    13384, Lr: 0.000300\n",
            "2019-12-31 07:04:49,044 Epoch  16 Step:     4200 Batch Loss:     2.394016 Tokens per Sec:    13601, Lr: 0.000300\n",
            "2019-12-31 07:05:03,978 Epoch  16 Step:     4300 Batch Loss:     2.542410 Tokens per Sec:    13681, Lr: 0.000300\n",
            "2019-12-31 07:05:06,484 Epoch  16: total training loss 644.54\n",
            "2019-12-31 07:05:06,484 EPOCH 17\n",
            "2019-12-31 07:05:18,808 Epoch  17 Step:     4400 Batch Loss:     2.453203 Tokens per Sec:    13454, Lr: 0.000300\n",
            "2019-12-31 07:05:33,650 Epoch  17 Step:     4500 Batch Loss:     2.475983 Tokens per Sec:    13630, Lr: 0.000300\n",
            "2019-12-31 07:05:46,454 Epoch  17: total training loss 623.35\n",
            "2019-12-31 07:05:46,454 EPOCH 18\n",
            "2019-12-31 07:05:48,383 Epoch  18 Step:     4600 Batch Loss:     2.462085 Tokens per Sec:    12807, Lr: 0.000300\n",
            "2019-12-31 07:06:03,125 Epoch  18 Step:     4700 Batch Loss:     2.184745 Tokens per Sec:    13834, Lr: 0.000300\n",
            "2019-12-31 07:06:17,981 Epoch  18 Step:     4800 Batch Loss:     2.145816 Tokens per Sec:    13542, Lr: 0.000300\n",
            "2019-12-31 07:06:26,361 Epoch  18: total training loss 606.98\n",
            "2019-12-31 07:06:26,361 EPOCH 19\n",
            "2019-12-31 07:06:32,781 Epoch  19 Step:     4900 Batch Loss:     2.451422 Tokens per Sec:    13949, Lr: 0.000300\n",
            "2019-12-31 07:06:47,471 Epoch  19 Step:     5000 Batch Loss:     1.994083 Tokens per Sec:    13568, Lr: 0.000300\n",
            "2019-12-31 07:07:21,203 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 07:07:21,203 Saving new checkpoint.\n",
            "2019-12-31 07:07:21,549 Example #0\n",
            "2019-12-31 07:07:21,549 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:07:21,549 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:07:21,549 \tHypothesis: Ọnana yen nẹrhẹ ayen se vwo ẹruọ rẹ obo ra guọnọre , rere ayen se vwo ruiruo .\n",
            "2019-12-31 07:07:21,549 Example #1\n",
            "2019-12-31 07:07:21,549 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:07:21,549 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:07:21,549 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:07:21,549 Example #2\n",
            "2019-12-31 07:07:21,549 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:07:21,549 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:07:21,549 \tHypothesis: Ẹkẹvuọvo , die yen egbomọphẹ ?\n",
            "2019-12-31 07:07:21,549 Example #3\n",
            "2019-12-31 07:07:21,550 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:07:21,550 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:07:21,550 \tHypothesis: Wọ guọnọ ukoko wẹn - a .\n",
            "2019-12-31 07:07:21,550 Validation result (greedy) at epoch  19, step     5000: bleu:   8.95, loss: 55712.8125, ppl:  14.3540, duration: 34.0789s\n",
            "2019-12-31 07:07:36,388 Epoch  19 Step:     5100 Batch Loss:     2.392033 Tokens per Sec:    13726, Lr: 0.000300\n",
            "2019-12-31 07:07:40,376 Epoch  19: total training loss 590.16\n",
            "2019-12-31 07:07:40,377 EPOCH 20\n",
            "2019-12-31 07:07:51,256 Epoch  20 Step:     5200 Batch Loss:     2.342585 Tokens per Sec:    13504, Lr: 0.000300\n",
            "2019-12-31 07:08:05,903 Epoch  20 Step:     5300 Batch Loss:     2.271941 Tokens per Sec:    13812, Lr: 0.000300\n",
            "2019-12-31 07:08:20,102 Epoch  20: total training loss 573.37\n",
            "2019-12-31 07:08:20,102 EPOCH 21\n",
            "2019-12-31 07:08:20,609 Epoch  21 Step:     5400 Batch Loss:     1.367846 Tokens per Sec:    12282, Lr: 0.000300\n",
            "2019-12-31 07:08:35,396 Epoch  21 Step:     5500 Batch Loss:     1.976137 Tokens per Sec:    13481, Lr: 0.000300\n",
            "2019-12-31 07:08:50,187 Epoch  21 Step:     5600 Batch Loss:     1.547307 Tokens per Sec:    13922, Lr: 0.000300\n",
            "2019-12-31 07:09:00,165 Epoch  21: total training loss 560.73\n",
            "2019-12-31 07:09:00,165 EPOCH 22\n",
            "2019-12-31 07:09:04,834 Epoch  22 Step:     5700 Batch Loss:     2.337951 Tokens per Sec:    13558, Lr: 0.000300\n",
            "2019-12-31 07:09:19,484 Epoch  22 Step:     5800 Batch Loss:     2.452579 Tokens per Sec:    13835, Lr: 0.000300\n",
            "2019-12-31 07:09:34,229 Epoch  22 Step:     5900 Batch Loss:     1.569645 Tokens per Sec:    13572, Lr: 0.000300\n",
            "2019-12-31 07:09:39,934 Epoch  22: total training loss 549.55\n",
            "2019-12-31 07:09:39,935 EPOCH 23\n",
            "2019-12-31 07:09:48,881 Epoch  23 Step:     6000 Batch Loss:     2.105916 Tokens per Sec:    13623, Lr: 0.000300\n",
            "2019-12-31 07:10:22,655 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 07:10:22,655 Saving new checkpoint.\n",
            "2019-12-31 07:10:23,013 Example #0\n",
            "2019-12-31 07:10:23,013 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:10:23,013 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:10:23,013 \tHypothesis: Ọnana yen nẹrhẹ ayen vwo oniso rẹ obo re se vwo ru obo re se vwo ru obo re chọre .\n",
            "2019-12-31 07:10:23,013 Example #1\n",
            "2019-12-31 07:10:23,013 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:10:23,013 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:10:23,013 \tHypothesis: Nonẹna , o ji vwo ighwoghwota re ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:10:23,014 Example #2\n",
            "2019-12-31 07:10:23,014 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:10:23,014 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:10:23,014 \tHypothesis: Ẹkẹvuọvo , egbomọphẹ vọ yen e vwo ruiruo ?\n",
            "2019-12-31 07:10:23,014 Example #3\n",
            "2019-12-31 07:10:23,014 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:10:23,014 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:10:23,014 \tHypothesis: Wo jẹ ukoko na rhọnvwe nẹ ukoko wẹn rhe - e .\n",
            "2019-12-31 07:10:23,014 Validation result (greedy) at epoch  23, step     6000: bleu:  10.16, loss: 54999.3242, ppl:  13.8725, duration: 34.1329s\n",
            "2019-12-31 07:10:37,987 Epoch  23 Step:     6100 Batch Loss:     1.944094 Tokens per Sec:    13701, Lr: 0.000300\n",
            "2019-12-31 07:10:52,850 Epoch  23 Step:     6200 Batch Loss:     2.100627 Tokens per Sec:    13753, Lr: 0.000300\n",
            "2019-12-31 07:10:53,985 Epoch  23: total training loss 531.26\n",
            "2019-12-31 07:10:53,985 EPOCH 24\n",
            "2019-12-31 07:11:07,617 Epoch  24 Step:     6300 Batch Loss:     1.290862 Tokens per Sec:    13407, Lr: 0.000300\n",
            "2019-12-31 07:11:22,517 Epoch  24 Step:     6400 Batch Loss:     1.348583 Tokens per Sec:    13468, Lr: 0.000300\n",
            "2019-12-31 07:11:34,352 Epoch  24: total training loss 525.52\n",
            "2019-12-31 07:11:34,352 EPOCH 25\n",
            "2019-12-31 07:11:37,391 Epoch  25 Step:     6500 Batch Loss:     1.742582 Tokens per Sec:    12984, Lr: 0.000300\n",
            "2019-12-31 07:11:52,375 Epoch  25 Step:     6600 Batch Loss:     2.128956 Tokens per Sec:    13696, Lr: 0.000300\n",
            "2019-12-31 07:12:07,241 Epoch  25 Step:     6700 Batch Loss:     1.881225 Tokens per Sec:    13659, Lr: 0.000300\n",
            "2019-12-31 07:12:14,609 Epoch  25: total training loss 507.45\n",
            "2019-12-31 07:12:14,609 EPOCH 26\n",
            "2019-12-31 07:12:22,095 Epoch  26 Step:     6800 Batch Loss:     1.145603 Tokens per Sec:    13430, Lr: 0.000300\n",
            "2019-12-31 07:12:36,854 Epoch  26 Step:     6900 Batch Loss:     1.602573 Tokens per Sec:    13353, Lr: 0.000300\n",
            "2019-12-31 07:12:51,686 Epoch  26 Step:     7000 Batch Loss:     1.932218 Tokens per Sec:    13621, Lr: 0.000300\n",
            "2019-12-31 07:13:25,459 Hooray! New best validation result [ppl]!\n",
            "2019-12-31 07:13:25,460 Saving new checkpoint.\n",
            "2019-12-31 07:13:25,807 Example #0\n",
            "2019-12-31 07:13:25,807 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:13:25,807 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:13:25,807 \tHypothesis: Ọnana nẹrhẹ ayen se vwo ẹwẹn rẹ aghwanre , je davwẹngba vwo nene odjekẹ rẹ Baibol na .\n",
            "2019-12-31 07:13:25,808 Example #1\n",
            "2019-12-31 07:13:25,808 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:13:25,808 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:13:25,808 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:13:25,808 Example #2\n",
            "2019-12-31 07:13:25,808 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:13:25,808 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:13:25,808 \tHypothesis: Ẹkẹvuọvo , die yen egbomọphẹ ?\n",
            "2019-12-31 07:13:25,808 Example #3\n",
            "2019-12-31 07:13:25,808 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:13:25,808 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:13:25,808 \tHypothesis: Wọ rha guọnọ ukoko wẹn - o .\n",
            "2019-12-31 07:13:25,808 Validation result (greedy) at epoch  26, step     7000: bleu:  11.26, loss: 54282.3398, ppl:  13.4050, duration: 34.1218s\n",
            "2019-12-31 07:13:29,124 Epoch  26: total training loss 499.46\n",
            "2019-12-31 07:13:29,125 EPOCH 27\n",
            "2019-12-31 07:13:40,699 Epoch  27 Step:     7100 Batch Loss:     1.918137 Tokens per Sec:    13304, Lr: 0.000300\n",
            "2019-12-31 07:13:55,680 Epoch  27 Step:     7200 Batch Loss:     1.854988 Tokens per Sec:    13690, Lr: 0.000300\n",
            "2019-12-31 07:14:09,467 Epoch  27: total training loss 488.60\n",
            "2019-12-31 07:14:09,467 EPOCH 28\n",
            "2019-12-31 07:14:10,587 Epoch  28 Step:     7300 Batch Loss:     1.792533 Tokens per Sec:    14309, Lr: 0.000300\n",
            "2019-12-31 07:14:25,438 Epoch  28 Step:     7400 Batch Loss:     1.934778 Tokens per Sec:    13459, Lr: 0.000300\n",
            "2019-12-31 07:14:40,228 Epoch  28 Step:     7500 Batch Loss:     1.964069 Tokens per Sec:    13553, Lr: 0.000300\n",
            "2019-12-31 07:14:49,735 Epoch  28: total training loss 476.73\n",
            "2019-12-31 07:14:49,735 EPOCH 29\n",
            "2019-12-31 07:14:55,175 Epoch  29 Step:     7600 Batch Loss:     1.516087 Tokens per Sec:    13747, Lr: 0.000300\n",
            "2019-12-31 07:15:09,971 Epoch  29 Step:     7700 Batch Loss:     1.268625 Tokens per Sec:    13611, Lr: 0.000300\n",
            "2019-12-31 07:15:24,918 Epoch  29 Step:     7800 Batch Loss:     2.163529 Tokens per Sec:    13709, Lr: 0.000300\n",
            "2019-12-31 07:15:29,970 Epoch  29: total training loss 465.72\n",
            "2019-12-31 07:15:29,970 EPOCH 30\n",
            "2019-12-31 07:15:39,892 Epoch  30 Step:     7900 Batch Loss:     1.518623 Tokens per Sec:    13499, Lr: 0.000300\n",
            "2019-12-31 07:15:54,837 Epoch  30 Step:     8000 Batch Loss:     1.001237 Tokens per Sec:    13340, Lr: 0.000300\n",
            "2019-12-31 07:16:28,566 Example #0\n",
            "2019-12-31 07:16:28,566 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:16:28,566 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:16:28,566 \tHypothesis: Ọnana yen nẹrhẹ e se vwo ẹwẹn rẹ doe , je nabọ muegbe rẹ iroro rẹ avwanre vwo nene .\n",
            "2019-12-31 07:16:28,566 Example #1\n",
            "2019-12-31 07:16:28,566 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:16:28,566 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:16:28,566 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:16:28,567 Example #2\n",
            "2019-12-31 07:16:28,567 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:16:28,567 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:16:28,567 \tHypothesis: Ẹkẹvuọvo , egbomọphẹ vọ yen a vwọ kẹ egbomọphẹ ?\n",
            "2019-12-31 07:16:28,567 Example #3\n",
            "2019-12-31 07:16:28,567 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:16:28,567 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:16:28,567 \tHypothesis: Wọ riẹnre nẹ ukoko wẹn vẹ ukoko wẹn ọvo yen wọ hepha na - a .\n",
            "2019-12-31 07:16:28,567 Validation result (greedy) at epoch  30, step     8000: bleu:  12.18, loss: 54547.9141, ppl:  13.5763, duration: 33.7300s\n",
            "2019-12-31 07:16:43,485 Epoch  30 Step:     8100 Batch Loss:     1.709848 Tokens per Sec:    13856, Lr: 0.000300\n",
            "2019-12-31 07:16:43,933 Epoch  30: total training loss 454.59\n",
            "2019-12-31 07:16:43,933 EPOCH 31\n",
            "2019-12-31 07:16:58,426 Epoch  31 Step:     8200 Batch Loss:     1.812082 Tokens per Sec:    13612, Lr: 0.000300\n",
            "2019-12-31 07:17:13,237 Epoch  31 Step:     8300 Batch Loss:     1.758263 Tokens per Sec:    13320, Lr: 0.000300\n",
            "2019-12-31 07:17:24,266 Epoch  31: total training loss 450.16\n",
            "2019-12-31 07:17:24,267 EPOCH 32\n",
            "2019-12-31 07:17:28,183 Epoch  32 Step:     8400 Batch Loss:     1.760935 Tokens per Sec:    13715, Lr: 0.000300\n",
            "2019-12-31 07:17:42,936 Epoch  32 Step:     8500 Batch Loss:     1.677956 Tokens per Sec:    13580, Lr: 0.000300\n",
            "2019-12-31 07:17:57,806 Epoch  32 Step:     8600 Batch Loss:     1.764362 Tokens per Sec:    13357, Lr: 0.000300\n",
            "2019-12-31 07:18:04,582 Epoch  32: total training loss 441.58\n",
            "2019-12-31 07:18:04,583 EPOCH 33\n",
            "2019-12-31 07:18:12,782 Epoch  33 Step:     8700 Batch Loss:     1.516737 Tokens per Sec:    13426, Lr: 0.000300\n",
            "2019-12-31 07:18:27,660 Epoch  33 Step:     8800 Batch Loss:     1.292231 Tokens per Sec:    13503, Lr: 0.000300\n",
            "2019-12-31 07:18:42,588 Epoch  33 Step:     8900 Batch Loss:     1.753464 Tokens per Sec:    13668, Lr: 0.000300\n",
            "2019-12-31 07:18:44,874 Epoch  33: total training loss 431.91\n",
            "2019-12-31 07:18:44,874 EPOCH 34\n",
            "2019-12-31 07:18:57,558 Epoch  34 Step:     9000 Batch Loss:     1.862409 Tokens per Sec:    13725, Lr: 0.000300\n",
            "2019-12-31 07:19:31,389 Example #0\n",
            "2019-12-31 07:19:31,389 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:19:31,390 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:19:31,390 \tHypothesis: Ọnana nẹrhẹ ayen se vwo oniso rẹ oborẹ ubiudu avwanre se vwo ruiruo wan .\n",
            "2019-12-31 07:19:31,390 Example #1\n",
            "2019-12-31 07:19:31,390 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:19:31,390 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:19:31,390 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:19:31,390 Example #2\n",
            "2019-12-31 07:19:31,390 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:19:31,390 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:19:31,390 \tHypothesis: Ẹkẹvuọvo , egbomọphẹ vọ yen a vwọ kẹ ?\n",
            "2019-12-31 07:19:31,390 Example #3\n",
            "2019-12-31 07:19:31,390 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:19:31,390 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:19:31,390 \tHypothesis: Wọ vwẹroso ukoko wẹn jovwo .\n",
            "2019-12-31 07:19:31,390 Validation result (greedy) at epoch  34, step     9000: bleu:  12.61, loss: 54701.2695, ppl:  13.6762, duration: 33.8314s\n",
            "2019-12-31 07:19:46,345 Epoch  34 Step:     9100 Batch Loss:     1.435172 Tokens per Sec:    13780, Lr: 0.000300\n",
            "2019-12-31 07:19:58,647 Epoch  34: total training loss 417.15\n",
            "2019-12-31 07:19:58,647 EPOCH 35\n",
            "2019-12-31 07:20:01,300 Epoch  35 Step:     9200 Batch Loss:     1.500320 Tokens per Sec:    14582, Lr: 0.000300\n",
            "2019-12-31 07:20:16,040 Epoch  35 Step:     9300 Batch Loss:     1.891159 Tokens per Sec:    13347, Lr: 0.000300\n",
            "2019-12-31 07:20:30,961 Epoch  35 Step:     9400 Batch Loss:     1.892118 Tokens per Sec:    13556, Lr: 0.000300\n",
            "2019-12-31 07:20:38,832 Epoch  35: total training loss 416.28\n",
            "2019-12-31 07:20:38,832 EPOCH 36\n",
            "2019-12-31 07:20:45,805 Epoch  36 Step:     9500 Batch Loss:     1.718353 Tokens per Sec:    13421, Lr: 0.000300\n",
            "2019-12-31 07:21:00,800 Epoch  36 Step:     9600 Batch Loss:     1.592865 Tokens per Sec:    13581, Lr: 0.000300\n",
            "2019-12-31 07:21:15,653 Epoch  36 Step:     9700 Batch Loss:     1.648238 Tokens per Sec:    13649, Lr: 0.000300\n",
            "2019-12-31 07:21:19,154 Epoch  36: total training loss 408.22\n",
            "2019-12-31 07:21:19,154 EPOCH 37\n",
            "2019-12-31 07:21:30,550 Epoch  37 Step:     9800 Batch Loss:     1.706467 Tokens per Sec:    13861, Lr: 0.000300\n",
            "2019-12-31 07:21:45,351 Epoch  37 Step:     9900 Batch Loss:     1.681366 Tokens per Sec:    13437, Lr: 0.000300\n",
            "2019-12-31 07:21:59,221 Epoch  37: total training loss 401.36\n",
            "2019-12-31 07:21:59,221 EPOCH 38\n",
            "2019-12-31 07:22:00,182 Epoch  38 Step:    10000 Batch Loss:     1.660343 Tokens per Sec:    13737, Lr: 0.000300\n",
            "2019-12-31 07:22:33,954 Example #0\n",
            "2019-12-31 07:22:33,955 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:22:33,955 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:22:33,955 \tHypothesis: Enana nẹrhẹ ayen se vwo ẹwẹn rẹ avwanre vwo muegbe rẹ iroro rẹ avwanre , ji nene odjekẹ rẹ Baibol na .\n",
            "2019-12-31 07:22:33,955 Example #1\n",
            "2019-12-31 07:22:33,955 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:22:33,955 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:22:33,955 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:22:33,955 Example #2\n",
            "2019-12-31 07:22:33,955 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:22:33,955 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:22:33,955 \tHypothesis: Ẹkẹvuọvo , egbomọphẹ vọ yen a vwọ kẹ ?\n",
            "2019-12-31 07:22:33,955 Example #3\n",
            "2019-12-31 07:22:33,955 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:22:33,955 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:22:33,955 \tHypothesis: Wọ rha riẹn nẹ ukoko wẹn vẹ ukoko ọfa - a .\n",
            "2019-12-31 07:22:33,955 Validation result (greedy) at epoch  38, step    10000: bleu:  12.95, loss: 55179.7461, ppl:  13.9927, duration: 33.7736s\n",
            "2019-12-31 07:22:48,834 Epoch  38 Step:    10100 Batch Loss:     1.684994 Tokens per Sec:    13447, Lr: 0.000300\n",
            "2019-12-31 07:23:03,751 Epoch  38 Step:    10200 Batch Loss:     1.395089 Tokens per Sec:    13622, Lr: 0.000300\n",
            "2019-12-31 07:23:13,392 Epoch  38: total training loss 395.41\n",
            "2019-12-31 07:23:13,392 EPOCH 39\n",
            "2019-12-31 07:23:18,659 Epoch  39 Step:    10300 Batch Loss:     1.058967 Tokens per Sec:    13694, Lr: 0.000300\n",
            "2019-12-31 07:23:33,611 Epoch  39 Step:    10400 Batch Loss:     1.043420 Tokens per Sec:    13491, Lr: 0.000300\n",
            "2019-12-31 07:23:48,601 Epoch  39 Step:    10500 Batch Loss:     1.775222 Tokens per Sec:    13584, Lr: 0.000300\n",
            "2019-12-31 07:23:53,704 Epoch  39: total training loss 383.54\n",
            "2019-12-31 07:23:53,704 EPOCH 40\n",
            "2019-12-31 07:24:03,512 Epoch  40 Step:    10600 Batch Loss:     1.408048 Tokens per Sec:    13430, Lr: 0.000300\n",
            "2019-12-31 07:24:18,440 Epoch  40 Step:    10700 Batch Loss:     1.531781 Tokens per Sec:    13340, Lr: 0.000300\n",
            "2019-12-31 07:24:33,265 Epoch  40 Step:    10800 Batch Loss:     1.343017 Tokens per Sec:    13682, Lr: 0.000300\n",
            "2019-12-31 07:24:34,159 Epoch  40: total training loss 382.81\n",
            "2019-12-31 07:24:34,159 EPOCH 41\n",
            "2019-12-31 07:24:48,186 Epoch  41 Step:    10900 Batch Loss:     0.993885 Tokens per Sec:    13652, Lr: 0.000300\n",
            "2019-12-31 07:25:03,010 Epoch  41 Step:    11000 Batch Loss:     0.823253 Tokens per Sec:    13285, Lr: 0.000300\n",
            "2019-12-31 07:25:36,790 Example #0\n",
            "2019-12-31 07:25:36,790 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:25:36,791 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:25:36,791 \tHypothesis: Ọnana nẹrhẹ ayen se vwo oniso rẹ oborẹ ubiudu avwanre se vwo ruiruo .\n",
            "2019-12-31 07:25:36,791 Example #1\n",
            "2019-12-31 07:25:36,791 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:25:36,791 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:25:36,791 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:25:36,791 Example #2\n",
            "2019-12-31 07:25:36,791 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:25:36,791 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:25:36,791 \tHypothesis: Ẹkẹvuọvo , egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:25:36,791 Example #3\n",
            "2019-12-31 07:25:36,791 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:25:36,792 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:25:36,792 \tHypothesis: Wọ vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:25:36,792 Validation result (greedy) at epoch  41, step    11000: bleu:  13.17, loss: 55769.6367, ppl:  14.3930, duration: 33.7811s\n",
            "2019-12-31 07:25:48,155 Epoch  41: total training loss 373.73\n",
            "2019-12-31 07:25:48,155 EPOCH 42\n",
            "2019-12-31 07:25:51,671 Epoch  42 Step:    11100 Batch Loss:     1.485085 Tokens per Sec:    14135, Lr: 0.000300\n",
            "2019-12-31 07:26:06,501 Epoch  42 Step:    11200 Batch Loss:     1.645910 Tokens per Sec:    13656, Lr: 0.000300\n",
            "2019-12-31 07:26:21,306 Epoch  42 Step:    11300 Batch Loss:     1.245075 Tokens per Sec:    13573, Lr: 0.000300\n",
            "2019-12-31 07:26:28,026 Epoch  42: total training loss 363.26\n",
            "2019-12-31 07:26:28,026 EPOCH 43\n",
            "2019-12-31 07:26:36,205 Epoch  43 Step:    11400 Batch Loss:     1.492405 Tokens per Sec:    13992, Lr: 0.000300\n",
            "2019-12-31 07:26:51,050 Epoch  43 Step:    11500 Batch Loss:     1.488233 Tokens per Sec:    13812, Lr: 0.000300\n",
            "2019-12-31 07:27:05,757 Epoch  43 Step:    11600 Batch Loss:     1.194742 Tokens per Sec:    13588, Lr: 0.000300\n",
            "2019-12-31 07:27:07,815 Epoch  43: total training loss 358.63\n",
            "2019-12-31 07:27:07,815 EPOCH 44\n",
            "2019-12-31 07:27:20,483 Epoch  44 Step:    11700 Batch Loss:     0.656036 Tokens per Sec:    13722, Lr: 0.000300\n",
            "2019-12-31 07:27:35,368 Epoch  44 Step:    11800 Batch Loss:     1.543444 Tokens per Sec:    13849, Lr: 0.000300\n",
            "2019-12-31 07:27:47,541 Epoch  44: total training loss 354.16\n",
            "2019-12-31 07:27:47,541 EPOCH 45\n",
            "2019-12-31 07:27:50,119 Epoch  45 Step:    11900 Batch Loss:     1.403820 Tokens per Sec:    13083, Lr: 0.000300\n",
            "2019-12-31 07:28:04,991 Epoch  45 Step:    12000 Batch Loss:     1.411539 Tokens per Sec:    13758, Lr: 0.000300\n",
            "2019-12-31 07:28:38,668 Example #0\n",
            "2019-12-31 07:28:38,668 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:28:38,668 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:28:38,668 \tHypothesis: Ẹnwan nana nẹrhẹ ihwo buebun se muegbe rẹ ayen vwo muegbe rẹ iroro rẹ avwanre , je reyọ ayen vwo ruiruo .\n",
            "2019-12-31 07:28:38,668 Example #1\n",
            "2019-12-31 07:28:38,668 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:28:38,668 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:28:38,668 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:28:38,668 Example #2\n",
            "2019-12-31 07:28:38,669 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:28:38,669 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:28:38,669 \tHypothesis: Kẹ egbomọphẹ vwo ?\n",
            "2019-12-31 07:28:38,669 Example #3\n",
            "2019-12-31 07:28:38,669 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:28:38,669 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:28:38,669 \tHypothesis: Wọ riẹnre nẹ ukoko wẹn yovwin nọ wẹ - ẹ .\n",
            "2019-12-31 07:28:38,669 Validation result (greedy) at epoch  45, step    12000: bleu:  13.49, loss: 56564.6719, ppl:  14.9507, duration: 33.6780s\n",
            "2019-12-31 07:28:53,376 Epoch  45 Step:    12100 Batch Loss:     1.306053 Tokens per Sec:    13489, Lr: 0.000300\n",
            "2019-12-31 07:29:01,360 Epoch  45: total training loss 351.32\n",
            "2019-12-31 07:29:01,360 EPOCH 46\n",
            "2019-12-31 07:29:08,301 Epoch  46 Step:    12200 Batch Loss:     1.005195 Tokens per Sec:    14128, Lr: 0.000300\n",
            "2019-12-31 07:29:23,083 Epoch  46 Step:    12300 Batch Loss:     1.091875 Tokens per Sec:    13500, Lr: 0.000300\n",
            "2019-12-31 07:29:37,956 Epoch  46 Step:    12400 Batch Loss:     0.687656 Tokens per Sec:    13488, Lr: 0.000300\n",
            "2019-12-31 07:29:41,380 Epoch  46: total training loss 343.18\n",
            "2019-12-31 07:29:41,380 EPOCH 47\n",
            "2019-12-31 07:29:52,949 Epoch  47 Step:    12500 Batch Loss:     1.045718 Tokens per Sec:    13965, Lr: 0.000300\n",
            "2019-12-31 07:30:07,637 Epoch  47 Step:    12600 Batch Loss:     1.468696 Tokens per Sec:    13524, Lr: 0.000300\n",
            "2019-12-31 07:30:21,283 Epoch  47: total training loss 338.36\n",
            "2019-12-31 07:30:21,283 EPOCH 48\n",
            "2019-12-31 07:30:22,330 Epoch  48 Step:    12700 Batch Loss:     1.017751 Tokens per Sec:    12864, Lr: 0.000300\n",
            "2019-12-31 07:30:37,170 Epoch  48 Step:    12800 Batch Loss:     1.525858 Tokens per Sec:    13621, Lr: 0.000300\n",
            "2019-12-31 07:30:51,918 Epoch  48 Step:    12900 Batch Loss:     1.321356 Tokens per Sec:    13467, Lr: 0.000300\n",
            "2019-12-31 07:31:01,576 Epoch  48: total training loss 335.82\n",
            "2019-12-31 07:31:01,577 EPOCH 49\n",
            "2019-12-31 07:31:06,772 Epoch  49 Step:    13000 Batch Loss:     1.303249 Tokens per Sec:    13987, Lr: 0.000300\n",
            "2019-12-31 07:31:40,484 Example #0\n",
            "2019-12-31 07:31:40,484 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:31:40,484 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:31:40,484 \tHypothesis: A mrẹ ọnana vwẹ idjerhe tiọna , kidie ayen muegbe rẹ ayen vwo nene odjekẹ rẹ Jihova .\n",
            "2019-12-31 07:31:40,484 Example #1\n",
            "2019-12-31 07:31:40,484 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:31:40,484 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:31:40,484 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:31:40,485 Example #2\n",
            "2019-12-31 07:31:40,485 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:31:40,485 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:31:40,485 \tHypothesis: Kẹ egbomọphẹ vọ yen a vwọ kẹ ayen ?\n",
            "2019-12-31 07:31:40,485 Example #3\n",
            "2019-12-31 07:31:40,485 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:31:40,485 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:31:40,485 \tHypothesis: Wọ vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:31:40,485 Validation result (greedy) at epoch  49, step    13000: bleu:  13.26, loss: 56796.0586, ppl:  15.1171, duration: 33.7130s\n",
            "2019-12-31 07:31:55,255 Epoch  49 Step:    13100 Batch Loss:     1.389380 Tokens per Sec:    13469, Lr: 0.000210\n",
            "2019-12-31 07:32:10,009 Epoch  49 Step:    13200 Batch Loss:     1.064081 Tokens per Sec:    13524, Lr: 0.000210\n",
            "2019-12-31 07:32:15,387 Epoch  49: total training loss 323.91\n",
            "2019-12-31 07:32:15,387 EPOCH 50\n",
            "2019-12-31 07:32:24,759 Epoch  50 Step:    13300 Batch Loss:     0.791446 Tokens per Sec:    13819, Lr: 0.000210\n",
            "2019-12-31 07:32:39,696 Epoch  50 Step:    13400 Batch Loss:     1.359972 Tokens per Sec:    13756, Lr: 0.000210\n",
            "2019-12-31 07:32:54,375 Epoch  50 Step:    13500 Batch Loss:     1.257862 Tokens per Sec:    13295, Lr: 0.000210\n",
            "2019-12-31 07:32:55,532 Epoch  50: total training loss 316.78\n",
            "2019-12-31 07:32:55,532 EPOCH 51\n",
            "2019-12-31 07:33:09,273 Epoch  51 Step:    13600 Batch Loss:     1.332123 Tokens per Sec:    13406, Lr: 0.000210\n",
            "2019-12-31 07:33:23,968 Epoch  51 Step:    13700 Batch Loss:     1.320551 Tokens per Sec:    13654, Lr: 0.000210\n",
            "2019-12-31 07:33:35,646 Epoch  51: total training loss 310.94\n",
            "2019-12-31 07:33:35,646 EPOCH 52\n",
            "2019-12-31 07:33:38,791 Epoch  52 Step:    13800 Batch Loss:     1.326518 Tokens per Sec:    13637, Lr: 0.000210\n",
            "2019-12-31 07:33:53,612 Epoch  52 Step:    13900 Batch Loss:     0.786145 Tokens per Sec:    13392, Lr: 0.000210\n",
            "2019-12-31 07:34:08,496 Epoch  52 Step:    14000 Batch Loss:     1.086468 Tokens per Sec:    13691, Lr: 0.000210\n",
            "2019-12-31 07:34:42,326 Example #0\n",
            "2019-12-31 07:34:42,326 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:34:42,326 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:34:42,326 \tHypothesis: Enana nẹrhẹ ayen muegbe phiyotọ rere ayen se vwo muegbe rẹ iroro rẹ avwanre , ji nene odjekẹ rẹ Jihova .\n",
            "2019-12-31 07:34:42,326 Example #1\n",
            "2019-12-31 07:34:42,326 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:34:42,326 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:34:42,326 \tHypothesis: Enẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:34:42,326 Example #2\n",
            "2019-12-31 07:34:42,327 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:34:42,327 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:34:42,327 \tHypothesis: Ẹkẹvuọvo , egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:34:42,327 Example #3\n",
            "2019-12-31 07:34:42,327 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:34:42,327 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:34:42,327 \tHypothesis: Wọ vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:34:42,327 Validation result (greedy) at epoch  52, step    14000: bleu:  13.94, loss: 57613.8438, ppl:  15.7199, duration: 33.8313s\n",
            "2019-12-31 07:34:49,612 Epoch  52: total training loss 306.30\n",
            "2019-12-31 07:34:49,613 EPOCH 53\n",
            "2019-12-31 07:34:57,183 Epoch  53 Step:    14100 Batch Loss:     1.331935 Tokens per Sec:    13640, Lr: 0.000210\n",
            "2019-12-31 07:35:12,018 Epoch  53 Step:    14200 Batch Loss:     0.785916 Tokens per Sec:    13685, Lr: 0.000210\n",
            "2019-12-31 07:35:26,934 Epoch  53 Step:    14300 Batch Loss:     1.049746 Tokens per Sec:    13668, Lr: 0.000210\n",
            "2019-12-31 07:35:29,572 Epoch  53: total training loss 302.14\n",
            "2019-12-31 07:35:29,572 EPOCH 54\n",
            "2019-12-31 07:35:41,774 Epoch  54 Step:    14400 Batch Loss:     1.237039 Tokens per Sec:    13656, Lr: 0.000210\n",
            "2019-12-31 07:35:56,680 Epoch  54 Step:    14500 Batch Loss:     1.054131 Tokens per Sec:    13559, Lr: 0.000210\n",
            "2019-12-31 07:36:09,812 Epoch  54: total training loss 300.16\n",
            "2019-12-31 07:36:09,813 EPOCH 55\n",
            "2019-12-31 07:36:11,531 Epoch  55 Step:    14600 Batch Loss:     1.191510 Tokens per Sec:    13756, Lr: 0.000210\n",
            "2019-12-31 07:36:26,345 Epoch  55 Step:    14700 Batch Loss:     1.276201 Tokens per Sec:    13744, Lr: 0.000210\n",
            "2019-12-31 07:36:41,171 Epoch  55 Step:    14800 Batch Loss:     1.238366 Tokens per Sec:    13368, Lr: 0.000210\n",
            "2019-12-31 07:36:50,167 Epoch  55: total training loss 297.80\n",
            "2019-12-31 07:36:50,167 EPOCH 56\n",
            "2019-12-31 07:36:56,088 Epoch  56 Step:    14900 Batch Loss:     0.694806 Tokens per Sec:    13567, Lr: 0.000210\n",
            "2019-12-31 07:37:10,996 Epoch  56 Step:    15000 Batch Loss:     0.932301 Tokens per Sec:    13844, Lr: 0.000210\n",
            "2019-12-31 07:37:44,803 Example #0\n",
            "2019-12-31 07:37:44,804 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:37:44,804 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:37:44,804 \tHypothesis: Enana nẹrhẹ ihwo buebun se muegbe rẹ iroro rayen , ayen me je nabọ nene odjekẹ rẹ ẹwẹn na .\n",
            "2019-12-31 07:37:44,804 Example #1\n",
            "2019-12-31 07:37:44,804 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:37:44,804 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:37:44,804 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:37:44,804 Example #2\n",
            "2019-12-31 07:37:44,804 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:37:44,804 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:37:44,805 \tHypothesis: Ẹkẹvuọvo , die yen egbomọphẹ ?\n",
            "2019-12-31 07:37:44,805 Example #3\n",
            "2019-12-31 07:37:44,805 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:37:44,805 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:37:44,805 \tHypothesis: Wọ je vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:37:44,805 Validation result (greedy) at epoch  56, step    15000: bleu:  13.85, loss: 58165.7656, ppl:  16.1403, duration: 33.8089s\n",
            "2019-12-31 07:37:59,785 Epoch  56 Step:    15100 Batch Loss:     1.242236 Tokens per Sec:    13386, Lr: 0.000210\n",
            "2019-12-31 07:38:04,108 Epoch  56: total training loss 289.53\n",
            "2019-12-31 07:38:04,108 EPOCH 57\n",
            "2019-12-31 07:38:14,736 Epoch  57 Step:    15200 Batch Loss:     1.003102 Tokens per Sec:    13638, Lr: 0.000210\n",
            "2019-12-31 07:38:29,607 Epoch  57 Step:    15300 Batch Loss:     1.058721 Tokens per Sec:    13589, Lr: 0.000210\n",
            "2019-12-31 07:38:44,320 Epoch  57: total training loss 288.69\n",
            "2019-12-31 07:38:44,320 EPOCH 58\n",
            "2019-12-31 07:38:44,516 Epoch  58 Step:    15400 Batch Loss:     1.080109 Tokens per Sec:    12250, Lr: 0.000210\n",
            "2019-12-31 07:38:59,397 Epoch  58 Step:    15500 Batch Loss:     1.339564 Tokens per Sec:    13319, Lr: 0.000210\n",
            "2019-12-31 07:39:14,251 Epoch  58 Step:    15600 Batch Loss:     1.139009 Tokens per Sec:    13703, Lr: 0.000210\n",
            "2019-12-31 07:39:24,965 Epoch  58: total training loss 287.31\n",
            "2019-12-31 07:39:24,965 EPOCH 59\n",
            "2019-12-31 07:39:29,128 Epoch  59 Step:    15700 Batch Loss:     1.161043 Tokens per Sec:    12641, Lr: 0.000210\n",
            "2019-12-31 07:39:44,108 Epoch  59 Step:    15800 Batch Loss:     1.135937 Tokens per Sec:    13509, Lr: 0.000210\n",
            "2019-12-31 07:39:58,904 Epoch  59 Step:    15900 Batch Loss:     0.949926 Tokens per Sec:    13314, Lr: 0.000210\n",
            "2019-12-31 07:40:05,583 Epoch  59: total training loss 286.00\n",
            "2019-12-31 07:40:05,584 EPOCH 60\n",
            "2019-12-31 07:40:13,830 Epoch  60 Step:    16000 Batch Loss:     1.321432 Tokens per Sec:    13721, Lr: 0.000210\n",
            "2019-12-31 07:40:47,531 Example #0\n",
            "2019-12-31 07:40:47,531 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:40:47,531 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:40:47,531 \tHypothesis: Osichọ nana nẹrhẹ ihwo efa se muegbe rẹ ubiudu avwanre , ayen me je nabọ nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 07:40:47,531 Example #1\n",
            "2019-12-31 07:40:47,531 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:40:47,531 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:40:47,532 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:40:47,532 Example #2\n",
            "2019-12-31 07:40:47,532 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:40:47,532 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:40:47,532 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:40:47,532 Example #3\n",
            "2019-12-31 07:40:47,532 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:40:47,532 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:40:47,532 \tHypothesis: Wọ vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:40:47,532 Validation result (greedy) at epoch  60, step    16000: bleu:  14.43, loss: 58579.1992, ppl:  16.4626, duration: 33.7021s\n",
            "2019-12-31 07:41:02,310 Epoch  60 Step:    16100 Batch Loss:     0.775197 Tokens per Sec:    13268, Lr: 0.000210\n",
            "2019-12-31 07:41:17,105 Epoch  60 Step:    16200 Batch Loss:     1.118505 Tokens per Sec:    13730, Lr: 0.000210\n",
            "2019-12-31 07:41:19,602 Epoch  60: total training loss 281.15\n",
            "2019-12-31 07:41:19,602 EPOCH 61\n",
            "2019-12-31 07:41:31,965 Epoch  61 Step:    16300 Batch Loss:     1.093107 Tokens per Sec:    13611, Lr: 0.000210\n",
            "2019-12-31 07:41:46,584 Epoch  61 Step:    16400 Batch Loss:     0.663107 Tokens per Sec:    13746, Lr: 0.000210\n",
            "2019-12-31 07:41:59,520 Epoch  61: total training loss 277.93\n",
            "2019-12-31 07:41:59,520 EPOCH 62\n",
            "2019-12-31 07:42:01,338 Epoch  62 Step:    16500 Batch Loss:     0.502820 Tokens per Sec:    13691, Lr: 0.000210\n",
            "2019-12-31 07:42:16,126 Epoch  62 Step:    16600 Batch Loss:     1.289075 Tokens per Sec:    13521, Lr: 0.000210\n",
            "2019-12-31 07:42:31,028 Epoch  62 Step:    16700 Batch Loss:     1.139260 Tokens per Sec:    13888, Lr: 0.000210\n",
            "2019-12-31 07:42:39,467 Epoch  62: total training loss 271.96\n",
            "2019-12-31 07:42:39,467 EPOCH 63\n",
            "2019-12-31 07:42:45,900 Epoch  63 Step:    16800 Batch Loss:     1.126073 Tokens per Sec:    13210, Lr: 0.000210\n",
            "2019-12-31 07:43:00,664 Epoch  63 Step:    16900 Batch Loss:     0.861230 Tokens per Sec:    13775, Lr: 0.000210\n",
            "2019-12-31 07:43:15,617 Epoch  63 Step:    17000 Batch Loss:     1.107252 Tokens per Sec:    13746, Lr: 0.000210\n",
            "2019-12-31 07:43:49,350 Example #0\n",
            "2019-12-31 07:43:49,351 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:43:49,351 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:43:49,351 \tHypothesis: Ebẹnbẹn nana nẹrhẹ ayen se muegbe rẹ ubiudu rayen eje , ji muegbe rẹ oborẹ avwanre seri .\n",
            "2019-12-31 07:43:49,351 Example #1\n",
            "2019-12-31 07:43:49,351 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:43:49,351 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:43:49,351 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:43:49,351 Example #2\n",
            "2019-12-31 07:43:49,351 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:43:49,351 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:43:49,351 \tHypothesis: Ẹkẹvuọvo , die yen egbomọphẹ ?\n",
            "2019-12-31 07:43:49,352 Example #3\n",
            "2019-12-31 07:43:49,352 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:43:49,352 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:43:49,352 \tHypothesis: Wọ je guọnọ yan nene ukoko wẹn ọfa jovwo re .\n",
            "2019-12-31 07:43:49,352 Validation result (greedy) at epoch  63, step    17000: bleu:  14.35, loss: 59104.8047, ppl:  16.8816, duration: 33.7348s\n",
            "2019-12-31 07:43:53,277 Epoch  63: total training loss 270.66\n",
            "2019-12-31 07:43:53,277 EPOCH 64\n",
            "2019-12-31 07:44:04,195 Epoch  64 Step:    17100 Batch Loss:     0.870494 Tokens per Sec:    13742, Lr: 0.000210\n",
            "2019-12-31 07:44:18,938 Epoch  64 Step:    17200 Batch Loss:     0.613872 Tokens per Sec:    13650, Lr: 0.000210\n",
            "2019-12-31 07:44:33,162 Epoch  64: total training loss 267.55\n",
            "2019-12-31 07:44:33,162 EPOCH 65\n",
            "2019-12-31 07:44:33,629 Epoch  65 Step:    17300 Batch Loss:     0.944022 Tokens per Sec:    11031, Lr: 0.000210\n",
            "2019-12-31 07:44:48,454 Epoch  65 Step:    17400 Batch Loss:     1.272358 Tokens per Sec:    13888, Lr: 0.000210\n",
            "2019-12-31 07:45:03,180 Epoch  65 Step:    17500 Batch Loss:     1.089397 Tokens per Sec:    13364, Lr: 0.000210\n",
            "2019-12-31 07:45:13,125 Epoch  65: total training loss 264.64\n",
            "2019-12-31 07:45:13,125 EPOCH 66\n",
            "2019-12-31 07:45:18,055 Epoch  66 Step:    17600 Batch Loss:     1.153565 Tokens per Sec:    13187, Lr: 0.000210\n",
            "2019-12-31 07:45:32,883 Epoch  66 Step:    17700 Batch Loss:     1.080082 Tokens per Sec:    13863, Lr: 0.000210\n",
            "2019-12-31 07:45:47,669 Epoch  66 Step:    17800 Batch Loss:     1.123865 Tokens per Sec:    13598, Lr: 0.000210\n",
            "2019-12-31 07:45:53,157 Epoch  66: total training loss 263.13\n",
            "2019-12-31 07:45:53,157 EPOCH 67\n",
            "2019-12-31 07:46:02,589 Epoch  67 Step:    17900 Batch Loss:     0.950385 Tokens per Sec:    13734, Lr: 0.000210\n",
            "2019-12-31 07:46:17,427 Epoch  67 Step:    18000 Batch Loss:     0.714562 Tokens per Sec:    13507, Lr: 0.000210\n",
            "2019-12-31 07:46:51,215 Example #0\n",
            "2019-12-31 07:46:51,216 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:46:51,216 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:46:51,216 \tHypothesis: Oyan nana nẹrhẹ ayen se muegbe rẹ iroro vẹ iruo rẹ avwanre , ayen me je nabọ nene odjekẹ na .\n",
            "2019-12-31 07:46:51,216 Example #1\n",
            "2019-12-31 07:46:51,216 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:46:51,216 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:46:51,216 \tHypothesis: Enẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:46:51,216 Example #2\n",
            "2019-12-31 07:46:51,216 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:46:51,216 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:46:51,216 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:46:51,216 Example #3\n",
            "2019-12-31 07:46:51,217 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:46:51,217 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:46:51,217 \tHypothesis: Wọ je vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:46:51,217 Validation result (greedy) at epoch  67, step    18000: bleu:  13.90, loss: 59971.5781, ppl:  17.5960, duration: 33.7899s\n",
            "2019-12-31 07:47:06,104 Epoch  67 Step:    18100 Batch Loss:     0.912966 Tokens per Sec:    13543, Lr: 0.000210\n",
            "2019-12-31 07:47:07,160 Epoch  67: total training loss 260.01\n",
            "2019-12-31 07:47:07,160 EPOCH 68\n",
            "2019-12-31 07:47:21,057 Epoch  68 Step:    18200 Batch Loss:     1.043700 Tokens per Sec:    13453, Lr: 0.000210\n",
            "2019-12-31 07:47:36,048 Epoch  68 Step:    18300 Batch Loss:     0.670840 Tokens per Sec:    13918, Lr: 0.000210\n",
            "2019-12-31 07:47:47,234 Epoch  68: total training loss 255.27\n",
            "2019-12-31 07:47:47,234 EPOCH 69\n",
            "2019-12-31 07:47:51,020 Epoch  69 Step:    18400 Batch Loss:     1.011232 Tokens per Sec:    13661, Lr: 0.000210\n",
            "2019-12-31 07:48:05,847 Epoch  69 Step:    18500 Batch Loss:     1.036424 Tokens per Sec:    13554, Lr: 0.000210\n",
            "2019-12-31 07:48:20,864 Epoch  69 Step:    18600 Batch Loss:     1.124365 Tokens per Sec:    13382, Lr: 0.000210\n",
            "2019-12-31 07:48:27,453 Epoch  69: total training loss 253.82\n",
            "2019-12-31 07:48:27,453 EPOCH 70\n",
            "2019-12-31 07:48:35,817 Epoch  70 Step:    18700 Batch Loss:     0.772242 Tokens per Sec:    13317, Lr: 0.000210\n",
            "2019-12-31 07:48:50,724 Epoch  70 Step:    18800 Batch Loss:     1.204297 Tokens per Sec:    13848, Lr: 0.000210\n",
            "2019-12-31 07:49:05,593 Epoch  70 Step:    18900 Batch Loss:     1.094892 Tokens per Sec:    13427, Lr: 0.000210\n",
            "2019-12-31 07:49:07,563 Epoch  70: total training loss 251.40\n",
            "2019-12-31 07:49:07,564 EPOCH 71\n",
            "2019-12-31 07:49:20,587 Epoch  71 Step:    19000 Batch Loss:     1.145924 Tokens per Sec:    13584, Lr: 0.000210\n",
            "2019-12-31 07:49:54,405 Example #0\n",
            "2019-12-31 07:49:54,405 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:49:54,405 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:49:54,405 \tHypothesis: Enana nẹrhẹ ihwo efa muegbe rẹ ayen vwo muegbe rẹ obo rehẹ ubiudu avwanre , je nene odjekẹ na .\n",
            "2019-12-31 07:49:54,405 Example #1\n",
            "2019-12-31 07:49:54,405 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:49:54,405 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:49:54,405 \tHypothesis: Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:49:54,405 Example #2\n",
            "2019-12-31 07:49:54,406 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:49:54,406 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:49:54,406 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:49:54,406 Example #3\n",
            "2019-12-31 07:49:54,406 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:49:54,406 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:49:54,406 \tHypothesis: Wọ je guọnọ ukoko wẹn ọfa .\n",
            "2019-12-31 07:49:54,406 Validation result (greedy) at epoch  71, step    19000: bleu:  14.10, loss: 60328.6914, ppl:  17.8990, duration: 33.8183s\n",
            "2019-12-31 07:50:09,371 Epoch  71 Step:    19100 Batch Loss:     1.050977 Tokens per Sec:    13614, Lr: 0.000147\n",
            "2019-12-31 07:50:21,724 Epoch  71: total training loss 246.91\n",
            "2019-12-31 07:50:21,724 EPOCH 72\n",
            "2019-12-31 07:50:24,262 Epoch  72 Step:    19200 Batch Loss:     0.450549 Tokens per Sec:    12690, Lr: 0.000147\n",
            "2019-12-31 07:50:39,234 Epoch  72 Step:    19300 Batch Loss:     0.853629 Tokens per Sec:    13687, Lr: 0.000147\n",
            "2019-12-31 07:50:54,168 Epoch  72 Step:    19400 Batch Loss:     0.891482 Tokens per Sec:    13935, Lr: 0.000147\n",
            "2019-12-31 07:51:01,721 Epoch  72: total training loss 239.45\n",
            "2019-12-31 07:51:01,721 EPOCH 73\n",
            "2019-12-31 07:51:08,985 Epoch  73 Step:    19500 Batch Loss:     1.079561 Tokens per Sec:    13156, Lr: 0.000147\n",
            "2019-12-31 07:51:23,955 Epoch  73 Step:    19600 Batch Loss:     1.002316 Tokens per Sec:    13846, Lr: 0.000147\n",
            "2019-12-31 07:51:38,831 Epoch  73 Step:    19700 Batch Loss:     0.988723 Tokens per Sec:    13756, Lr: 0.000147\n",
            "2019-12-31 07:51:41,769 Epoch  73: total training loss 239.19\n",
            "2019-12-31 07:51:41,769 EPOCH 74\n",
            "2019-12-31 07:51:53,585 Epoch  74 Step:    19800 Batch Loss:     0.640235 Tokens per Sec:    13263, Lr: 0.000147\n",
            "2019-12-31 07:52:08,590 Epoch  74 Step:    19900 Batch Loss:     1.022838 Tokens per Sec:    13644, Lr: 0.000147\n",
            "2019-12-31 07:52:22,155 Epoch  74: total training loss 238.37\n",
            "2019-12-31 07:52:22,156 EPOCH 75\n",
            "2019-12-31 07:52:23,374 Epoch  75 Step:    20000 Batch Loss:     0.906945 Tokens per Sec:    12966, Lr: 0.000147\n",
            "2019-12-31 07:52:57,184 Example #0\n",
            "2019-12-31 07:52:57,184 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:52:57,184 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:52:57,184 \tHypothesis: Enana nẹrhẹ ihwo efa se muegbe rẹ ayen vwo muegbe rẹ obo rehẹ ubiudu avwanre , ji nene odjekẹ na .\n",
            "2019-12-31 07:52:57,185 Example #1\n",
            "2019-12-31 07:52:57,185 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:52:57,185 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:52:57,185 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:52:57,185 Example #2\n",
            "2019-12-31 07:52:57,185 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:52:57,185 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:52:57,185 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:52:57,185 Example #3\n",
            "2019-12-31 07:52:57,185 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:52:57,185 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:52:57,185 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn vẹ ukoko wẹn ro jovwo na .\n",
            "2019-12-31 07:52:57,186 Validation result (greedy) at epoch  75, step    20000: bleu:  14.55, loss: 60525.4727, ppl:  18.0682, duration: 33.8115s\n",
            "2019-12-31 07:53:12,078 Epoch  75 Step:    20100 Batch Loss:     1.026942 Tokens per Sec:    13675, Lr: 0.000147\n",
            "2019-12-31 07:53:26,775 Epoch  75 Step:    20200 Batch Loss:     1.028158 Tokens per Sec:    13415, Lr: 0.000147\n",
            "2019-12-31 07:53:36,291 Epoch  75: total training loss 235.79\n",
            "2019-12-31 07:53:36,291 EPOCH 76\n",
            "2019-12-31 07:53:41,639 Epoch  76 Step:    20300 Batch Loss:     0.872331 Tokens per Sec:    13056, Lr: 0.000147\n",
            "2019-12-31 07:53:56,459 Epoch  76 Step:    20400 Batch Loss:     0.974461 Tokens per Sec:    13431, Lr: 0.000147\n",
            "2019-12-31 07:54:11,488 Epoch  76 Step:    20500 Batch Loss:     0.923049 Tokens per Sec:    13733, Lr: 0.000147\n",
            "2019-12-31 07:54:16,788 Epoch  76: total training loss 234.84\n",
            "2019-12-31 07:54:16,788 EPOCH 77\n",
            "2019-12-31 07:54:26,426 Epoch  77 Step:    20600 Batch Loss:     0.581441 Tokens per Sec:    13485, Lr: 0.000147\n",
            "2019-12-31 07:54:41,404 Epoch  77 Step:    20700 Batch Loss:     0.839941 Tokens per Sec:    13588, Lr: 0.000147\n",
            "2019-12-31 07:54:56,243 Epoch  77 Step:    20800 Batch Loss:     0.593163 Tokens per Sec:    13439, Lr: 0.000147\n",
            "2019-12-31 07:54:57,168 Epoch  77: total training loss 230.64\n",
            "2019-12-31 07:54:57,168 EPOCH 78\n",
            "2019-12-31 07:55:11,224 Epoch  78 Step:    20900 Batch Loss:     0.635312 Tokens per Sec:    13392, Lr: 0.000147\n",
            "2019-12-31 07:55:26,134 Epoch  78 Step:    21000 Batch Loss:     0.955511 Tokens per Sec:    13504, Lr: 0.000147\n",
            "2019-12-31 07:55:59,934 Example #0\n",
            "2019-12-31 07:55:59,935 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:55:59,935 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:55:59,935 \tHypothesis: Ihoho nana nẹrhẹ ayen se muegbe rẹ ubiudu rayen , ayen me je nabọ muegbe rẹ oborẹ avwanre yonori .\n",
            "2019-12-31 07:55:59,935 Example #1\n",
            "2019-12-31 07:55:59,935 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:55:59,935 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:55:59,935 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:55:59,935 Example #2\n",
            "2019-12-31 07:55:59,935 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:55:59,935 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:55:59,936 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:55:59,936 Example #3\n",
            "2019-12-31 07:55:59,936 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:55:59,936 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:55:59,936 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ nure .\n",
            "2019-12-31 07:55:59,936 Validation result (greedy) at epoch  78, step    21000: bleu:  14.48, loss: 61023.7109, ppl:  18.5039, duration: 33.8013s\n",
            "2019-12-31 07:56:11,388 Epoch  78: total training loss 230.17\n",
            "2019-12-31 07:56:11,388 EPOCH 79\n",
            "2019-12-31 07:56:14,832 Epoch  79 Step:    21100 Batch Loss:     0.634400 Tokens per Sec:    13166, Lr: 0.000147\n",
            "2019-12-31 07:56:29,734 Epoch  79 Step:    21200 Batch Loss:     0.793637 Tokens per Sec:    13534, Lr: 0.000147\n",
            "2019-12-31 07:56:44,726 Epoch  79 Step:    21300 Batch Loss:     0.754804 Tokens per Sec:    13533, Lr: 0.000147\n",
            "2019-12-31 07:56:51,738 Epoch  79: total training loss 227.47\n",
            "2019-12-31 07:56:51,738 EPOCH 80\n",
            "2019-12-31 07:56:59,622 Epoch  80 Step:    21400 Batch Loss:     0.845221 Tokens per Sec:    13202, Lr: 0.000147\n",
            "2019-12-31 07:57:14,470 Epoch  80 Step:    21500 Batch Loss:     0.930473 Tokens per Sec:    13504, Lr: 0.000147\n",
            "2019-12-31 07:57:29,341 Epoch  80 Step:    21600 Batch Loss:     0.399067 Tokens per Sec:    13554, Lr: 0.000147\n",
            "2019-12-31 07:57:32,338 Epoch  80: total training loss 227.87\n",
            "2019-12-31 07:57:32,339 EPOCH 81\n",
            "2019-12-31 07:57:44,336 Epoch  81 Step:    21700 Batch Loss:     1.001035 Tokens per Sec:    13749, Lr: 0.000147\n",
            "2019-12-31 07:57:59,179 Epoch  81 Step:    21800 Batch Loss:     0.795596 Tokens per Sec:    13464, Lr: 0.000147\n",
            "2019-12-31 07:58:12,574 Epoch  81: total training loss 224.77\n",
            "2019-12-31 07:58:12,574 EPOCH 82\n",
            "2019-12-31 07:58:14,125 Epoch  82 Step:    21900 Batch Loss:     0.872754 Tokens per Sec:    13425, Lr: 0.000147\n",
            "2019-12-31 07:58:29,065 Epoch  82 Step:    22000 Batch Loss:     0.965728 Tokens per Sec:    13633, Lr: 0.000147\n",
            "2019-12-31 07:59:02,907 Example #0\n",
            "2019-12-31 07:59:02,907 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 07:59:02,907 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 07:59:02,907 \tHypothesis: Ihoho nana nẹrhẹ ayen se muegbe rẹ ubiudu avwanre , ji nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 07:59:02,907 Example #1\n",
            "2019-12-31 07:59:02,907 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 07:59:02,907 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:59:02,907 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 07:59:02,908 Example #2\n",
            "2019-12-31 07:59:02,908 \tSource:     But freedom from what ?\n",
            "2019-12-31 07:59:02,908 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 07:59:02,908 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 07:59:02,908 Example #3\n",
            "2019-12-31 07:59:02,908 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 07:59:02,908 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 07:59:02,908 \tHypothesis: Wọ je vwẹroso ukoko wẹn jovwo re .\n",
            "2019-12-31 07:59:02,908 Validation result (greedy) at epoch  82, step    22000: bleu:  14.41, loss: 61665.9570, ppl:  19.0809, duration: 33.8427s\n",
            "2019-12-31 07:59:17,755 Epoch  82 Step:    22100 Batch Loss:     1.046957 Tokens per Sec:    13367, Lr: 0.000147\n",
            "2019-12-31 07:59:26,828 Epoch  82: total training loss 223.08\n",
            "2019-12-31 07:59:26,828 EPOCH 83\n",
            "2019-12-31 07:59:32,698 Epoch  83 Step:    22200 Batch Loss:     0.395410 Tokens per Sec:    13809, Lr: 0.000147\n",
            "2019-12-31 07:59:47,606 Epoch  83 Step:    22300 Batch Loss:     0.757746 Tokens per Sec:    13413, Lr: 0.000147\n",
            "2019-12-31 08:00:02,660 Epoch  83 Step:    22400 Batch Loss:     0.406432 Tokens per Sec:    13740, Lr: 0.000147\n",
            "2019-12-31 08:00:07,195 Epoch  83: total training loss 222.05\n",
            "2019-12-31 08:00:07,195 EPOCH 84\n",
            "2019-12-31 08:00:17,540 Epoch  84 Step:    22500 Batch Loss:     0.903960 Tokens per Sec:    13877, Lr: 0.000147\n",
            "2019-12-31 08:00:32,503 Epoch  84 Step:    22600 Batch Loss:     0.983404 Tokens per Sec:    13303, Lr: 0.000147\n",
            "2019-12-31 08:00:47,450 Epoch  84 Step:    22700 Batch Loss:     0.904634 Tokens per Sec:    13579, Lr: 0.000147\n",
            "2019-12-31 08:00:47,451 Epoch  84: total training loss 218.51\n",
            "2019-12-31 08:00:47,451 EPOCH 85\n",
            "2019-12-31 08:01:02,503 Epoch  85 Step:    22800 Batch Loss:     0.816896 Tokens per Sec:    13549, Lr: 0.000147\n",
            "2019-12-31 08:01:17,350 Epoch  85 Step:    22900 Batch Loss:     0.859011 Tokens per Sec:    13531, Lr: 0.000147\n",
            "2019-12-31 08:01:27,849 Epoch  85: total training loss 218.53\n",
            "2019-12-31 08:01:27,849 EPOCH 86\n",
            "2019-12-31 08:01:32,229 Epoch  86 Step:    23000 Batch Loss:     0.690743 Tokens per Sec:    13111, Lr: 0.000147\n",
            "2019-12-31 08:02:06,068 Example #0\n",
            "2019-12-31 08:02:06,068 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:02:06,068 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:02:06,068 \tHypothesis: Enana nẹrhẹ ihwo ni opharo rẹ avwanre , ayen muegbe rẹ ayen vwo nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 08:02:06,068 Example #1\n",
            "2019-12-31 08:02:06,068 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:02:06,068 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:02:06,068 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:02:06,068 Example #2\n",
            "2019-12-31 08:02:06,068 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:02:06,069 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:02:06,069 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:02:06,069 Example #3\n",
            "2019-12-31 08:02:06,069 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:02:06,069 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:02:06,069 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ nure .\n",
            "2019-12-31 08:02:06,069 Validation result (greedy) at epoch  86, step    23000: bleu:  14.13, loss: 61697.9805, ppl:  19.1102, duration: 33.8394s\n",
            "2019-12-31 08:02:20,911 Epoch  86 Step:    23100 Batch Loss:     0.723428 Tokens per Sec:    13765, Lr: 0.000147\n",
            "2019-12-31 08:02:35,758 Epoch  86 Step:    23200 Batch Loss:     0.777151 Tokens per Sec:    13440, Lr: 0.000147\n",
            "2019-12-31 08:02:41,741 Epoch  86: total training loss 216.06\n",
            "2019-12-31 08:02:41,741 EPOCH 87\n",
            "2019-12-31 08:02:50,768 Epoch  87 Step:    23300 Batch Loss:     0.904117 Tokens per Sec:    13317, Lr: 0.000147\n",
            "2019-12-31 08:03:05,718 Epoch  87 Step:    23400 Batch Loss:     0.684742 Tokens per Sec:    13432, Lr: 0.000147\n",
            "2019-12-31 08:03:20,612 Epoch  87 Step:    23500 Batch Loss:     0.683818 Tokens per Sec:    13709, Lr: 0.000147\n",
            "2019-12-31 08:03:22,095 Epoch  87: total training loss 215.53\n",
            "2019-12-31 08:03:22,095 EPOCH 88\n",
            "2019-12-31 08:03:35,457 Epoch  88 Step:    23600 Batch Loss:     0.964557 Tokens per Sec:    13396, Lr: 0.000147\n",
            "2019-12-31 08:03:50,390 Epoch  88 Step:    23700 Batch Loss:     0.859948 Tokens per Sec:    13891, Lr: 0.000147\n",
            "2019-12-31 08:04:02,077 Epoch  88: total training loss 212.63\n",
            "2019-12-31 08:04:02,077 EPOCH 89\n",
            "2019-12-31 08:04:05,457 Epoch  89 Step:    23800 Batch Loss:     0.803128 Tokens per Sec:    13798, Lr: 0.000147\n",
            "2019-12-31 08:04:20,364 Epoch  89 Step:    23900 Batch Loss:     0.681111 Tokens per Sec:    13352, Lr: 0.000147\n",
            "2019-12-31 08:04:35,226 Epoch  89 Step:    24000 Batch Loss:     1.011402 Tokens per Sec:    13921, Lr: 0.000147\n",
            "2019-12-31 08:05:09,041 Example #0\n",
            "2019-12-31 08:05:09,041 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:05:09,042 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:05:09,042 \tHypothesis: Omaẹkparọ nẹrhẹ ihwo efa muegbe rẹ ayen vwo muegbe rẹ obo rehẹ ubiudu avwanre , ji nene odjekẹ na .\n",
            "2019-12-31 08:05:09,042 Example #1\n",
            "2019-12-31 08:05:09,042 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:05:09,042 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:05:09,042 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:05:09,042 Example #2\n",
            "2019-12-31 08:05:09,042 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:05:09,042 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:05:09,042 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:05:09,042 Example #3\n",
            "2019-12-31 08:05:09,043 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:05:09,043 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:05:09,043 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ nure .\n",
            "2019-12-31 08:05:09,043 Validation result (greedy) at epoch  89, step    24000: bleu:  14.51, loss: 62261.5195, ppl:  19.6321, duration: 33.8165s\n",
            "2019-12-31 08:05:16,275 Epoch  89: total training loss 212.92\n",
            "2019-12-31 08:05:16,275 EPOCH 90\n",
            "2019-12-31 08:05:23,913 Epoch  90 Step:    24100 Batch Loss:     0.698223 Tokens per Sec:    13657, Lr: 0.000147\n",
            "2019-12-31 08:05:38,865 Epoch  90 Step:    24200 Batch Loss:     0.823427 Tokens per Sec:    13556, Lr: 0.000147\n",
            "2019-12-31 08:05:53,847 Epoch  90 Step:    24300 Batch Loss:     0.494558 Tokens per Sec:    13554, Lr: 0.000147\n",
            "2019-12-31 08:05:56,501 Epoch  90: total training loss 210.07\n",
            "2019-12-31 08:05:56,501 EPOCH 91\n",
            "2019-12-31 08:06:08,772 Epoch  91 Step:    24400 Batch Loss:     0.809604 Tokens per Sec:    13435, Lr: 0.000147\n",
            "2019-12-31 08:06:23,745 Epoch  91 Step:    24500 Batch Loss:     0.923940 Tokens per Sec:    13606, Lr: 0.000147\n",
            "2019-12-31 08:06:36,915 Epoch  91: total training loss 210.43\n",
            "2019-12-31 08:06:36,915 EPOCH 92\n",
            "2019-12-31 08:06:38,647 Epoch  92 Step:    24600 Batch Loss:     0.761108 Tokens per Sec:    14282, Lr: 0.000147\n",
            "2019-12-31 08:06:53,544 Epoch  92 Step:    24700 Batch Loss:     0.954293 Tokens per Sec:    13616, Lr: 0.000147\n",
            "2019-12-31 08:07:08,481 Epoch  92 Step:    24800 Batch Loss:     0.939290 Tokens per Sec:    13480, Lr: 0.000147\n",
            "2019-12-31 08:07:17,137 Epoch  92: total training loss 207.64\n",
            "2019-12-31 08:07:17,137 EPOCH 93\n",
            "2019-12-31 08:07:23,413 Epoch  93 Step:    24900 Batch Loss:     0.868608 Tokens per Sec:    13154, Lr: 0.000147\n",
            "2019-12-31 08:07:38,392 Epoch  93 Step:    25000 Batch Loss:     0.356983 Tokens per Sec:    13460, Lr: 0.000147\n",
            "2019-12-31 08:08:12,162 Example #0\n",
            "2019-12-31 08:08:12,162 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:08:12,162 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:08:12,162 \tHypothesis: Ihoho nana nẹrhẹ ayen se muegbe rẹ ubiudu rayen phiyọ , ayen me je nabọ muegbe rẹ oborẹ avwanre yonori .\n",
            "2019-12-31 08:08:12,162 Example #1\n",
            "2019-12-31 08:08:12,163 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:08:12,163 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:08:12,163 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:08:12,163 Example #2\n",
            "2019-12-31 08:08:12,163 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:08:12,163 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:08:12,163 \tHypothesis: Die yen egbomọphẹ ?\n",
            "2019-12-31 08:08:12,163 Example #3\n",
            "2019-12-31 08:08:12,163 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:08:12,163 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:08:12,163 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ nure .\n",
            "2019-12-31 08:08:12,163 Validation result (greedy) at epoch  93, step    25000: bleu:  14.49, loss: 62652.5859, ppl:  20.0027, duration: 33.7710s\n",
            "2019-12-31 08:08:27,031 Epoch  93 Step:    25100 Batch Loss:     0.833798 Tokens per Sec:    13794, Lr: 0.000103\n",
            "2019-12-31 08:08:31,179 Epoch  93: total training loss 206.05\n",
            "2019-12-31 08:08:31,180 EPOCH 94\n",
            "2019-12-31 08:08:41,985 Epoch  94 Step:    25200 Batch Loss:     0.896664 Tokens per Sec:    13432, Lr: 0.000103\n",
            "2019-12-31 08:08:56,905 Epoch  94 Step:    25300 Batch Loss:     0.770320 Tokens per Sec:    13398, Lr: 0.000103\n",
            "2019-12-31 08:09:11,453 Epoch  94: total training loss 201.40\n",
            "2019-12-31 08:09:11,453 EPOCH 95\n",
            "2019-12-31 08:09:11,794 Epoch  95 Step:    25400 Batch Loss:     0.753056 Tokens per Sec:    11766, Lr: 0.000103\n",
            "2019-12-31 08:09:26,696 Epoch  95 Step:    25500 Batch Loss:     0.646330 Tokens per Sec:    13487, Lr: 0.000103\n",
            "2019-12-31 08:09:41,555 Epoch  95 Step:    25600 Batch Loss:     0.880025 Tokens per Sec:    13754, Lr: 0.000103\n",
            "2019-12-31 08:09:51,630 Epoch  95: total training loss 200.13\n",
            "2019-12-31 08:09:51,631 EPOCH 96\n",
            "2019-12-31 08:09:56,507 Epoch  96 Step:    25700 Batch Loss:     0.580037 Tokens per Sec:    13308, Lr: 0.000103\n",
            "2019-12-31 08:10:11,289 Epoch  96 Step:    25800 Batch Loss:     0.790462 Tokens per Sec:    13633, Lr: 0.000103\n",
            "2019-12-31 08:10:26,149 Epoch  96 Step:    25900 Batch Loss:     0.689909 Tokens per Sec:    13501, Lr: 0.000103\n",
            "2019-12-31 08:10:31,984 Epoch  96: total training loss 199.20\n",
            "2019-12-31 08:10:31,984 EPOCH 97\n",
            "2019-12-31 08:10:41,191 Epoch  97 Step:    26000 Batch Loss:     0.800894 Tokens per Sec:    13551, Lr: 0.000103\n",
            "2019-12-31 08:11:15,027 Example #0\n",
            "2019-12-31 08:11:15,028 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:11:15,028 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:11:15,028 \tHypothesis: Ihoho nana churobọ si ubiudu rayen , ayen me je nabọ muegbe rẹ oborẹ avwanre yonori .\n",
            "2019-12-31 08:11:15,028 Example #1\n",
            "2019-12-31 08:11:15,028 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:11:15,028 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:11:15,028 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:11:15,028 Example #2\n",
            "2019-12-31 08:11:15,028 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:11:15,028 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:11:15,028 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:11:15,028 Example #3\n",
            "2019-12-31 08:11:15,028 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:11:15,028 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:11:15,028 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ nure .\n",
            "2019-12-31 08:11:15,029 Validation result (greedy) at epoch  97, step    26000: bleu:  14.73, loss: 62752.0742, ppl:  20.0981, duration: 33.8377s\n",
            "2019-12-31 08:11:29,947 Epoch  97 Step:    26100 Batch Loss:     0.860063 Tokens per Sec:    13616, Lr: 0.000103\n",
            "2019-12-31 08:11:44,672 Epoch  97 Step:    26200 Batch Loss:     0.839487 Tokens per Sec:    13409, Lr: 0.000103\n",
            "2019-12-31 08:11:46,150 Epoch  97: total training loss 198.43\n",
            "2019-12-31 08:11:46,151 EPOCH 98\n",
            "2019-12-31 08:11:59,666 Epoch  98 Step:    26300 Batch Loss:     0.830938 Tokens per Sec:    13622, Lr: 0.000103\n",
            "2019-12-31 08:12:14,541 Epoch  98 Step:    26400 Batch Loss:     0.935780 Tokens per Sec:    13437, Lr: 0.000103\n",
            "2019-12-31 08:12:26,447 Epoch  98: total training loss 197.23\n",
            "2019-12-31 08:12:26,447 EPOCH 99\n",
            "2019-12-31 08:12:29,317 Epoch  99 Step:    26500 Batch Loss:     0.875571 Tokens per Sec:    13825, Lr: 0.000103\n",
            "2019-12-31 08:12:44,262 Epoch  99 Step:    26600 Batch Loss:     0.352891 Tokens per Sec:    13702, Lr: 0.000103\n",
            "2019-12-31 08:12:59,066 Epoch  99 Step:    26700 Batch Loss:     0.833609 Tokens per Sec:    13488, Lr: 0.000103\n",
            "2019-12-31 08:13:06,533 Epoch  99: total training loss 194.85\n",
            "2019-12-31 08:13:06,533 EPOCH 100\n",
            "2019-12-31 08:13:14,083 Epoch 100 Step:    26800 Batch Loss:     0.637767 Tokens per Sec:    13720, Lr: 0.000103\n",
            "2019-12-31 08:13:28,823 Epoch 100 Step:    26900 Batch Loss:     0.532974 Tokens per Sec:    13263, Lr: 0.000103\n",
            "2019-12-31 08:13:43,810 Epoch 100 Step:    27000 Batch Loss:     0.383115 Tokens per Sec:    13737, Lr: 0.000103\n",
            "2019-12-31 08:14:17,614 Example #0\n",
            "2019-12-31 08:14:17,614 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:14:17,614 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:14:17,614 \tHypothesis: Ihoho nana nẹrhẹ ayen se muegbe rẹ ubiudu rayen , ayen me je nabọ muegbe rẹ oborẹ avwanre che ru .\n",
            "2019-12-31 08:14:17,614 Example #1\n",
            "2019-12-31 08:14:17,614 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:14:17,614 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:14:17,614 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:14:17,614 Example #2\n",
            "2019-12-31 08:14:17,615 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:14:17,615 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:14:17,615 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:14:17,615 Example #3\n",
            "2019-12-31 08:14:17,615 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:14:17,615 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:14:17,615 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ nure .\n",
            "2019-12-31 08:14:17,615 Validation result (greedy) at epoch 100, step    27000: bleu:  15.06, loss: 62960.0586, ppl:  20.2990, duration: 33.8048s\n",
            "2019-12-31 08:14:20,675 Epoch 100: total training loss 195.52\n",
            "2019-12-31 08:14:20,675 EPOCH 101\n",
            "2019-12-31 08:14:32,553 Epoch 101 Step:    27100 Batch Loss:     0.880407 Tokens per Sec:    13567, Lr: 0.000103\n",
            "2019-12-31 08:14:47,508 Epoch 101 Step:    27200 Batch Loss:     0.542491 Tokens per Sec:    13770, Lr: 0.000103\n",
            "2019-12-31 08:15:01,034 Epoch 101: total training loss 193.58\n",
            "2019-12-31 08:15:01,034 EPOCH 102\n",
            "2019-12-31 08:15:02,403 Epoch 102 Step:    27300 Batch Loss:     0.871403 Tokens per Sec:    13309, Lr: 0.000103\n",
            "2019-12-31 08:15:17,348 Epoch 102 Step:    27400 Batch Loss:     0.793331 Tokens per Sec:    13614, Lr: 0.000103\n",
            "2019-12-31 08:15:32,176 Epoch 102 Step:    27500 Batch Loss:     0.909883 Tokens per Sec:    13253, Lr: 0.000103\n",
            "2019-12-31 08:15:41,366 Epoch 102: total training loss 192.21\n",
            "2019-12-31 08:15:41,366 EPOCH 103\n",
            "2019-12-31 08:15:47,148 Epoch 103 Step:    27600 Batch Loss:     0.856236 Tokens per Sec:    13411, Lr: 0.000103\n",
            "2019-12-31 08:16:02,063 Epoch 103 Step:    27700 Batch Loss:     0.708610 Tokens per Sec:    13737, Lr: 0.000103\n",
            "2019-12-31 08:16:16,996 Epoch 103 Step:    27800 Batch Loss:     0.633424 Tokens per Sec:    13478, Lr: 0.000103\n",
            "2019-12-31 08:16:21,740 Epoch 103: total training loss 191.63\n",
            "2019-12-31 08:16:21,741 EPOCH 104\n",
            "2019-12-31 08:16:31,880 Epoch 104 Step:    27900 Batch Loss:     0.491745 Tokens per Sec:    13300, Lr: 0.000103\n",
            "2019-12-31 08:16:46,706 Epoch 104 Step:    28000 Batch Loss:     0.372040 Tokens per Sec:    13779, Lr: 0.000103\n",
            "2019-12-31 08:17:20,419 Example #0\n",
            "2019-12-31 08:17:20,419 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:17:20,420 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:17:20,420 \tHypothesis: Ihoho nana nẹrhẹ ayen se muegbe rẹ ubiudu rayen , ayen me je nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 08:17:20,420 Example #1\n",
            "2019-12-31 08:17:20,420 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:17:20,420 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:17:20,420 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:17:20,420 Example #2\n",
            "2019-12-31 08:17:20,420 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:17:20,420 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:17:20,420 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:17:20,420 Example #3\n",
            "2019-12-31 08:17:20,420 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:17:20,420 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:17:20,420 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn yovwin nọ wẹ - ẹ .\n",
            "2019-12-31 08:17:20,420 Validation result (greedy) at epoch 104, step    28000: bleu:  15.01, loss: 63579.7070, ppl:  20.9094, duration: 33.7139s\n",
            "2019-12-31 08:17:35,252 Epoch 104 Step:    28100 Batch Loss:     0.861210 Tokens per Sec:    13773, Lr: 0.000103\n",
            "2019-12-31 08:17:35,401 Epoch 104: total training loss 189.86\n",
            "2019-12-31 08:17:35,402 EPOCH 105\n",
            "2019-12-31 08:17:50,225 Epoch 105 Step:    28200 Batch Loss:     0.928001 Tokens per Sec:    13623, Lr: 0.000103\n",
            "2019-12-31 08:18:05,015 Epoch 105 Step:    28300 Batch Loss:     0.688904 Tokens per Sec:    13628, Lr: 0.000103\n",
            "2019-12-31 08:18:15,385 Epoch 105: total training loss 189.11\n",
            "2019-12-31 08:18:15,385 EPOCH 106\n",
            "2019-12-31 08:18:19,859 Epoch 106 Step:    28400 Batch Loss:     0.750112 Tokens per Sec:    12831, Lr: 0.000103\n",
            "2019-12-31 08:18:34,721 Epoch 106 Step:    28500 Batch Loss:     0.744182 Tokens per Sec:    13755, Lr: 0.000103\n",
            "2019-12-31 08:18:49,336 Epoch 106 Step:    28600 Batch Loss:     0.835842 Tokens per Sec:    13869, Lr: 0.000103\n",
            "2019-12-31 08:18:55,260 Epoch 106: total training loss 188.94\n",
            "2019-12-31 08:18:55,261 EPOCH 107\n",
            "2019-12-31 08:19:04,177 Epoch 107 Step:    28700 Batch Loss:     0.718878 Tokens per Sec:    13587, Lr: 0.000103\n",
            "2019-12-31 08:19:18,887 Epoch 107 Step:    28800 Batch Loss:     0.802731 Tokens per Sec:    13650, Lr: 0.000103\n",
            "2019-12-31 08:19:33,537 Epoch 107 Step:    28900 Batch Loss:     0.671088 Tokens per Sec:    13860, Lr: 0.000103\n",
            "2019-12-31 08:19:35,021 Epoch 107: total training loss 187.64\n",
            "2019-12-31 08:19:35,021 EPOCH 108\n",
            "2019-12-31 08:19:48,220 Epoch 108 Step:    29000 Batch Loss:     0.744159 Tokens per Sec:    14015, Lr: 0.000103\n",
            "2019-12-31 08:20:21,944 Example #0\n",
            "2019-12-31 08:20:21,945 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:20:21,945 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:20:21,945 \tHypothesis: Ihoho nana nẹrhẹ ihwo roro nẹ ayen che muegbe rẹ ubiudu avwanre , je nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 08:20:21,945 Example #1\n",
            "2019-12-31 08:20:21,945 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:20:21,945 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:20:21,945 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:20:21,945 Example #2\n",
            "2019-12-31 08:20:21,945 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:20:21,945 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:20:21,946 \tHypothesis: Die kọyen egbomọphẹ ?\n",
            "2019-12-31 08:20:21,946 Example #3\n",
            "2019-12-31 08:20:21,946 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:20:21,946 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:20:21,946 \tHypothesis: Ọ je hepha kẹ ukoko wẹn vẹ ukoko wẹn ro vwo oka rẹ ovwan na ọfa - a .\n",
            "2019-12-31 08:20:21,946 Validation result (greedy) at epoch 108, step    29000: bleu:  14.89, loss: 63843.8672, ppl:  21.1752, duration: 33.7256s\n",
            "2019-12-31 08:20:36,682 Epoch 108 Step:    29100 Batch Loss:     0.891153 Tokens per Sec:    13388, Lr: 0.000103\n",
            "2019-12-31 08:20:48,392 Epoch 108: total training loss 187.12\n",
            "2019-12-31 08:20:48,392 EPOCH 109\n",
            "2019-12-31 08:20:51,382 Epoch 109 Step:    29200 Batch Loss:     0.728129 Tokens per Sec:    14000, Lr: 0.000103\n",
            "2019-12-31 08:21:06,115 Epoch 109 Step:    29300 Batch Loss:     0.619693 Tokens per Sec:    13408, Lr: 0.000103\n",
            "2019-12-31 08:21:20,918 Epoch 109 Step:    29400 Batch Loss:     0.911039 Tokens per Sec:    13772, Lr: 0.000103\n",
            "2019-12-31 08:21:28,466 Epoch 109: total training loss 187.66\n",
            "2019-12-31 08:21:28,466 EPOCH 110\n",
            "2019-12-31 08:21:35,511 Epoch 110 Step:    29500 Batch Loss:     0.824383 Tokens per Sec:    13307, Lr: 0.000103\n",
            "2019-12-31 08:21:50,191 Epoch 110 Step:    29600 Batch Loss:     0.877905 Tokens per Sec:    14070, Lr: 0.000103\n",
            "2019-12-31 08:22:04,899 Epoch 110 Step:    29700 Batch Loss:     0.511252 Tokens per Sec:    13970, Lr: 0.000103\n",
            "2019-12-31 08:22:08,037 Epoch 110: total training loss 185.65\n",
            "2019-12-31 08:22:08,037 EPOCH 111\n",
            "2019-12-31 08:22:19,345 Epoch 111 Step:    29800 Batch Loss:     0.817970 Tokens per Sec:    13773, Lr: 0.000103\n",
            "2019-12-31 08:22:33,970 Epoch 111 Step:    29900 Batch Loss:     0.541562 Tokens per Sec:    13603, Lr: 0.000103\n",
            "2019-12-31 08:22:47,925 Epoch 111: total training loss 186.77\n",
            "2019-12-31 08:22:47,925 EPOCH 112\n",
            "2019-12-31 08:22:48,548 Epoch 112 Step:    30000 Batch Loss:     0.565645 Tokens per Sec:    11169, Lr: 0.000103\n",
            "2019-12-31 08:23:22,160 Example #0\n",
            "2019-12-31 08:23:22,161 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:23:22,161 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:23:22,161 \tHypothesis: Ihoho nana nẹrhẹ e se muegbe rẹ ubiudu avwanre , ayen me je nabọ muegbe rẹ oborẹ avwanre che ru .\n",
            "2019-12-31 08:23:22,161 Example #1\n",
            "2019-12-31 08:23:22,161 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:23:22,161 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:23:22,161 \tHypothesis: Enẹna , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:23:22,161 Example #2\n",
            "2019-12-31 08:23:22,161 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:23:22,161 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:23:22,161 \tHypothesis: Die yen egbomọphẹ ?\n",
            "2019-12-31 08:23:22,161 Example #3\n",
            "2019-12-31 08:23:22,161 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:23:22,161 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:23:22,162 \tHypothesis: Ọ je dianẹ ovwan je yan obaro vwẹ ukoko wẹn ọfa - a .\n",
            "2019-12-31 08:23:22,162 Validation result (greedy) at epoch 112, step    30000: bleu:  14.17, loss: 63885.1172, ppl:  21.2170, duration: 33.6139s\n",
            "2019-12-31 08:23:36,850 Epoch 112 Step:    30100 Batch Loss:     0.451706 Tokens per Sec:    13599, Lr: 0.000103\n",
            "2019-12-31 08:23:51,510 Epoch 112 Step:    30200 Batch Loss:     0.474423 Tokens per Sec:    14008, Lr: 0.000103\n",
            "2019-12-31 08:24:01,170 Epoch 112: total training loss 184.29\n",
            "2019-12-31 08:24:01,170 EPOCH 113\n",
            "2019-12-31 08:24:06,296 Epoch 113 Step:    30300 Batch Loss:     0.697271 Tokens per Sec:    13793, Lr: 0.000103\n",
            "2019-12-31 08:24:20,758 Epoch 113 Step:    30400 Batch Loss:     0.743529 Tokens per Sec:    13803, Lr: 0.000103\n",
            "2019-12-31 08:24:35,331 Epoch 113 Step:    30500 Batch Loss:     0.725941 Tokens per Sec:    13680, Lr: 0.000103\n",
            "2019-12-31 08:24:40,937 Epoch 113: total training loss 183.82\n",
            "2019-12-31 08:24:40,937 EPOCH 114\n",
            "2019-12-31 08:24:50,100 Epoch 114 Step:    30600 Batch Loss:     0.692801 Tokens per Sec:    14110, Lr: 0.000103\n",
            "2019-12-31 08:25:04,741 Epoch 114 Step:    30700 Batch Loss:     0.356915 Tokens per Sec:    13379, Lr: 0.000103\n",
            "2019-12-31 08:25:19,367 Epoch 114 Step:    30800 Batch Loss:     0.715890 Tokens per Sec:    13855, Lr: 0.000103\n",
            "2019-12-31 08:25:20,565 Epoch 114: total training loss 181.90\n",
            "2019-12-31 08:25:20,565 EPOCH 115\n",
            "2019-12-31 08:25:33,996 Epoch 115 Step:    30900 Batch Loss:     0.760353 Tokens per Sec:    13800, Lr: 0.000103\n",
            "2019-12-31 08:25:48,510 Epoch 115 Step:    31000 Batch Loss:     0.846082 Tokens per Sec:    13814, Lr: 0.000103\n",
            "2019-12-31 08:26:22,152 Example #0\n",
            "2019-12-31 08:26:22,153 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:26:22,153 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:26:22,153 \tHypothesis: Ihoho nana nẹrhẹ ihwo roro nẹ ayen che muegbe rẹ ubiudu avwanre , je nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 08:26:22,153 Example #1\n",
            "2019-12-31 08:26:22,153 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:26:22,153 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:26:22,153 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:26:22,153 Example #2\n",
            "2019-12-31 08:26:22,153 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:26:22,153 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:26:22,153 \tHypothesis: Die yen egbomọphẹ ?\n",
            "2019-12-31 08:26:22,153 Example #3\n",
            "2019-12-31 08:26:22,154 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:26:22,154 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:26:22,154 \tHypothesis: Ọ je hepha kẹ ukoko wẹn jovwo - o .\n",
            "2019-12-31 08:26:22,154 Validation result (greedy) at epoch 115, step    31000: bleu:  14.80, loss: 64090.2539, ppl:  21.4262, duration: 33.6435s\n",
            "2019-12-31 08:26:33,461 Epoch 115: total training loss 179.71\n",
            "2019-12-31 08:26:33,461 EPOCH 116\n",
            "2019-12-31 08:26:36,880 Epoch 116 Step:    31100 Batch Loss:     0.547474 Tokens per Sec:    13448, Lr: 0.000072\n",
            "2019-12-31 08:26:51,474 Epoch 116 Step:    31200 Batch Loss:     0.625631 Tokens per Sec:    13932, Lr: 0.000072\n",
            "2019-12-31 08:27:06,060 Epoch 116 Step:    31300 Batch Loss:     0.768800 Tokens per Sec:    13774, Lr: 0.000072\n",
            "2019-12-31 08:27:12,989 Epoch 116: total training loss 178.01\n",
            "2019-12-31 08:27:12,989 EPOCH 117\n",
            "2019-12-31 08:27:20,635 Epoch 117 Step:    31400 Batch Loss:     0.576995 Tokens per Sec:    13612, Lr: 0.000072\n",
            "2019-12-31 08:27:35,302 Epoch 117 Step:    31500 Batch Loss:     0.813073 Tokens per Sec:    13943, Lr: 0.000072\n",
            "2019-12-31 08:27:49,812 Epoch 117 Step:    31600 Batch Loss:     0.714269 Tokens per Sec:    13863, Lr: 0.000072\n",
            "2019-12-31 08:27:52,404 Epoch 117: total training loss 176.84\n",
            "2019-12-31 08:27:52,404 EPOCH 118\n",
            "2019-12-31 08:28:04,401 Epoch 118 Step:    31700 Batch Loss:     0.583710 Tokens per Sec:    13717, Lr: 0.000072\n",
            "2019-12-31 08:28:18,978 Epoch 118 Step:    31800 Batch Loss:     0.643689 Tokens per Sec:    13814, Lr: 0.000072\n",
            "2019-12-31 08:28:31,886 Epoch 118: total training loss 176.23\n",
            "2019-12-31 08:28:31,886 EPOCH 119\n",
            "2019-12-31 08:28:33,650 Epoch 119 Step:    31900 Batch Loss:     0.445811 Tokens per Sec:    13144, Lr: 0.000072\n",
            "2019-12-31 08:28:48,137 Epoch 119 Step:    32000 Batch Loss:     0.812937 Tokens per Sec:    13998, Lr: 0.000072\n",
            "2019-12-31 08:29:21,645 Example #0\n",
            "2019-12-31 08:29:21,645 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:29:21,645 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:29:21,645 \tHypothesis: Ihoho nana toroba oborẹ ubiudu avwanre se vwo muegbe rẹ ayen che vwo ru nene .\n",
            "2019-12-31 08:29:21,645 Example #1\n",
            "2019-12-31 08:29:21,646 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:29:21,646 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:29:21,646 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:29:21,646 Example #2\n",
            "2019-12-31 08:29:21,646 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:29:21,646 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:29:21,646 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ?\n",
            "2019-12-31 08:29:21,646 Example #3\n",
            "2019-12-31 08:29:21,646 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:29:21,646 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:29:21,646 \tHypothesis: Ọ je hepha kẹ ukoko wẹn ri jovwo na , wo ji no ọtiọye - en .\n",
            "2019-12-31 08:29:21,646 Validation result (greedy) at epoch 119, step    32000: bleu:  15.04, loss: 64068.0273, ppl:  21.4034, duration: 33.5093s\n",
            "2019-12-31 08:29:36,081 Epoch 119 Step:    32100 Batch Loss:     0.580997 Tokens per Sec:    13755, Lr: 0.000072\n",
            "2019-12-31 08:29:44,696 Epoch 119: total training loss 176.20\n",
            "2019-12-31 08:29:44,696 EPOCH 120\n",
            "2019-12-31 08:29:50,684 Epoch 120 Step:    32200 Batch Loss:     0.782154 Tokens per Sec:    13827, Lr: 0.000072\n",
            "2019-12-31 08:30:05,160 Epoch 120 Step:    32300 Batch Loss:     0.705442 Tokens per Sec:    13865, Lr: 0.000072\n",
            "2019-12-31 08:30:19,643 Epoch 120 Step:    32400 Batch Loss:     0.641414 Tokens per Sec:    13829, Lr: 0.000072\n",
            "2019-12-31 08:30:24,000 Epoch 120: total training loss 175.48\n",
            "2019-12-31 08:30:24,000 EPOCH 121\n",
            "2019-12-31 08:30:34,179 Epoch 121 Step:    32500 Batch Loss:     0.456334 Tokens per Sec:    13758, Lr: 0.000072\n",
            "2019-12-31 08:30:48,627 Epoch 121 Step:    32600 Batch Loss:     0.797962 Tokens per Sec:    14015, Lr: 0.000072\n",
            "2019-12-31 08:31:03,008 Epoch 121 Step:    32700 Batch Loss:     0.701287 Tokens per Sec:    13838, Lr: 0.000072\n",
            "2019-12-31 08:31:03,303 Epoch 121: total training loss 174.83\n",
            "2019-12-31 08:31:03,303 EPOCH 122\n",
            "2019-12-31 08:31:17,798 Epoch 122 Step:    32800 Batch Loss:     0.788418 Tokens per Sec:    13909, Lr: 0.000072\n",
            "2019-12-31 08:31:32,423 Epoch 122 Step:    32900 Batch Loss:     0.700216 Tokens per Sec:    13811, Lr: 0.000072\n",
            "2019-12-31 08:31:42,642 Epoch 122: total training loss 172.71\n",
            "2019-12-31 08:31:42,642 EPOCH 123\n",
            "2019-12-31 08:31:46,826 Epoch 123 Step:    33000 Batch Loss:     0.682884 Tokens per Sec:    14435, Lr: 0.000072\n",
            "2019-12-31 08:32:20,349 Example #0\n",
            "2019-12-31 08:32:20,350 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:32:20,350 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:32:20,350 \tHypothesis: Ihoho nana toroba obo rehẹ ubiudu rẹ avwanre , kidie ayen muegbe rẹ ayen vwo nene ọrhuẹrẹphiyotọ na .\n",
            "2019-12-31 08:32:20,350 Example #1\n",
            "2019-12-31 08:32:20,350 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:32:20,350 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:32:20,350 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:32:20,350 Example #2\n",
            "2019-12-31 08:32:20,350 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:32:20,350 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:32:20,350 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ?\n",
            "2019-12-31 08:32:20,350 Example #3\n",
            "2019-12-31 08:32:20,350 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:32:20,350 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:32:20,350 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn ro vwo oka rẹ ovwan na ọfa - a .\n",
            "2019-12-31 08:32:20,350 Validation result (greedy) at epoch 123, step    33000: bleu:  15.01, loss: 64300.8086, ppl:  21.6430, duration: 33.5238s\n",
            "2019-12-31 08:32:34,814 Epoch 123 Step:    33100 Batch Loss:     0.708807 Tokens per Sec:    13836, Lr: 0.000072\n",
            "2019-12-31 08:32:49,398 Epoch 123 Step:    33200 Batch Loss:     0.482665 Tokens per Sec:    14072, Lr: 0.000072\n",
            "2019-12-31 08:32:55,234 Epoch 123: total training loss 172.43\n",
            "2019-12-31 08:32:55,234 EPOCH 124\n",
            "2019-12-31 08:33:03,927 Epoch 124 Step:    33300 Batch Loss:     0.745029 Tokens per Sec:    14101, Lr: 0.000072\n",
            "2019-12-31 08:33:18,480 Epoch 124 Step:    33400 Batch Loss:     0.749112 Tokens per Sec:    13505, Lr: 0.000072\n",
            "2019-12-31 08:33:33,032 Epoch 124 Step:    33500 Batch Loss:     0.645195 Tokens per Sec:    13862, Lr: 0.000072\n",
            "2019-12-31 08:33:34,660 Epoch 124: total training loss 172.04\n",
            "2019-12-31 08:33:34,660 EPOCH 125\n",
            "2019-12-31 08:33:47,673 Epoch 125 Step:    33600 Batch Loss:     0.725643 Tokens per Sec:    13701, Lr: 0.000072\n",
            "2019-12-31 08:34:02,083 Epoch 125 Step:    33700 Batch Loss:     0.329218 Tokens per Sec:    13977, Lr: 0.000072\n",
            "2019-12-31 08:34:13,869 Epoch 125: total training loss 172.38\n",
            "2019-12-31 08:34:13,869 EPOCH 126\n",
            "2019-12-31 08:34:16,570 Epoch 126 Step:    33800 Batch Loss:     0.455275 Tokens per Sec:    14614, Lr: 0.000072\n",
            "2019-12-31 08:34:30,970 Epoch 126 Step:    33900 Batch Loss:     0.687133 Tokens per Sec:    13738, Lr: 0.000072\n",
            "2019-12-31 08:34:45,472 Epoch 126 Step:    34000 Batch Loss:     0.614258 Tokens per Sec:    14191, Lr: 0.000072\n",
            "2019-12-31 08:35:19,016 Example #0\n",
            "2019-12-31 08:35:19,017 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:35:19,017 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:35:19,017 \tHypothesis: Ihoho nana nẹrhẹ ihwo roro nẹ ayen che muegbe rẹ ubiudu avwanre , je nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 08:35:19,017 Example #1\n",
            "2019-12-31 08:35:19,017 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:35:19,017 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:35:19,017 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:35:19,017 Example #2\n",
            "2019-12-31 08:35:19,017 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:35:19,017 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:35:19,018 \tHypothesis: Die kọyen egbomọphẹ ?\n",
            "2019-12-31 08:35:19,018 Example #3\n",
            "2019-12-31 08:35:19,018 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:35:19,018 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:35:19,018 \tHypothesis: Ọ je hepha kẹ ukoko wẹn jovwo , wo ji no nẹ ukoko wẹn yovwin nọ - ọ .\n",
            "2019-12-31 08:35:19,018 Validation result (greedy) at epoch 126, step    34000: bleu:  14.89, loss: 64409.6836, ppl:  21.7559, duration: 33.5455s\n",
            "2019-12-31 08:35:26,814 Epoch 126: total training loss 171.41\n",
            "2019-12-31 08:35:26,814 EPOCH 127\n",
            "2019-12-31 08:35:33,465 Epoch 127 Step:    34100 Batch Loss:     0.470043 Tokens per Sec:    13709, Lr: 0.000072\n",
            "2019-12-31 08:35:47,978 Epoch 127 Step:    34200 Batch Loss:     0.549079 Tokens per Sec:    13924, Lr: 0.000072\n",
            "2019-12-31 08:36:02,424 Epoch 127 Step:    34300 Batch Loss:     0.335187 Tokens per Sec:    14004, Lr: 0.000072\n",
            "2019-12-31 08:36:06,125 Epoch 127: total training loss 171.47\n",
            "2019-12-31 08:36:06,126 EPOCH 128\n",
            "2019-12-31 08:36:16,903 Epoch 128 Step:    34400 Batch Loss:     0.479019 Tokens per Sec:    13803, Lr: 0.000072\n",
            "2019-12-31 08:36:31,386 Epoch 128 Step:    34500 Batch Loss:     0.529685 Tokens per Sec:    13881, Lr: 0.000072\n",
            "2019-12-31 08:36:45,419 Epoch 128: total training loss 170.12\n",
            "2019-12-31 08:36:45,420 EPOCH 129\n",
            "2019-12-31 08:36:45,897 Epoch 129 Step:    34600 Batch Loss:     0.621966 Tokens per Sec:    10921, Lr: 0.000072\n",
            "2019-12-31 08:37:00,417 Epoch 129 Step:    34700 Batch Loss:     0.296525 Tokens per Sec:    13966, Lr: 0.000072\n",
            "2019-12-31 08:37:14,951 Epoch 129 Step:    34800 Batch Loss:     0.709406 Tokens per Sec:    14071, Lr: 0.000072\n",
            "2019-12-31 08:37:24,555 Epoch 129: total training loss 169.07\n",
            "2019-12-31 08:37:24,555 EPOCH 130\n",
            "2019-12-31 08:37:29,425 Epoch 130 Step:    34900 Batch Loss:     0.679545 Tokens per Sec:    14058, Lr: 0.000072\n",
            "2019-12-31 08:37:43,824 Epoch 130 Step:    35000 Batch Loss:     0.619538 Tokens per Sec:    13891, Lr: 0.000072\n",
            "2019-12-31 08:38:17,317 Example #0\n",
            "2019-12-31 08:38:17,318 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:38:17,318 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:38:17,318 \tHypothesis: Ihoho nana toroba obo rehẹ ubiudu rayen , kidie ayen muegbe rẹ ayen vwo nene odjekẹ na .\n",
            "2019-12-31 08:38:17,318 Example #1\n",
            "2019-12-31 08:38:17,318 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:38:17,318 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:38:17,318 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:38:17,318 Example #2\n",
            "2019-12-31 08:38:17,318 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:38:17,318 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:38:17,318 \tHypothesis: Die kọyen egbomọphẹ ?\n",
            "2019-12-31 08:38:17,318 Example #3\n",
            "2019-12-31 08:38:17,318 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:38:17,318 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:38:17,319 \tHypothesis: Ọ je hepha kẹ ukoko wẹn jovwo , wo ji no ọtiọye - en .\n",
            "2019-12-31 08:38:17,319 Validation result (greedy) at epoch 130, step    35000: bleu:  14.92, loss: 64785.3125, ppl:  22.1502, duration: 33.4939s\n",
            "2019-12-31 08:38:31,732 Epoch 130 Step:    35100 Batch Loss:     0.425837 Tokens per Sec:    14068, Lr: 0.000072\n",
            "2019-12-31 08:38:37,186 Epoch 130: total training loss 168.85\n",
            "2019-12-31 08:38:37,186 EPOCH 131\n",
            "2019-12-31 08:38:46,263 Epoch 131 Step:    35200 Batch Loss:     0.675949 Tokens per Sec:    13969, Lr: 0.000072\n",
            "2019-12-31 08:39:00,906 Epoch 131 Step:    35300 Batch Loss:     0.569329 Tokens per Sec:    13871, Lr: 0.000072\n",
            "2019-12-31 08:39:15,401 Epoch 131 Step:    35400 Batch Loss:     0.749621 Tokens per Sec:    13825, Lr: 0.000072\n",
            "2019-12-31 08:39:16,440 Epoch 131: total training loss 167.70\n",
            "2019-12-31 08:39:16,441 EPOCH 132\n",
            "2019-12-31 08:39:29,939 Epoch 132 Step:    35500 Batch Loss:     0.570535 Tokens per Sec:    13893, Lr: 0.000072\n",
            "2019-12-31 08:39:44,448 Epoch 132 Step:    35600 Batch Loss:     0.694527 Tokens per Sec:    14036, Lr: 0.000072\n",
            "2019-12-31 08:39:55,453 Epoch 132: total training loss 167.57\n",
            "2019-12-31 08:39:55,453 EPOCH 133\n",
            "2019-12-31 08:39:58,948 Epoch 133 Step:    35700 Batch Loss:     0.670402 Tokens per Sec:    13051, Lr: 0.000072\n",
            "2019-12-31 08:40:13,536 Epoch 133 Step:    35800 Batch Loss:     0.434567 Tokens per Sec:    14156, Lr: 0.000072\n",
            "2019-12-31 08:40:28,058 Epoch 133 Step:    35900 Batch Loss:     0.658459 Tokens per Sec:    14040, Lr: 0.000072\n",
            "2019-12-31 08:40:34,659 Epoch 133: total training loss 167.31\n",
            "2019-12-31 08:40:34,659 EPOCH 134\n",
            "2019-12-31 08:40:42,449 Epoch 134 Step:    36000 Batch Loss:     0.556847 Tokens per Sec:    13505, Lr: 0.000072\n",
            "2019-12-31 08:41:15,959 Example #0\n",
            "2019-12-31 08:41:15,960 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:41:15,960 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:41:15,960 \tHypothesis: Omaẹkparọ yen nẹrhẹ ihwo efa muegbe rẹ ayen vwo muegbe rẹ obo rehẹ ubiudu avwanre , je nene odjekẹ na .\n",
            "2019-12-31 08:41:15,960 Example #1\n",
            "2019-12-31 08:41:15,960 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:41:15,960 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:41:15,960 \tHypothesis: Enẹna , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:41:15,960 Example #2\n",
            "2019-12-31 08:41:15,960 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:41:15,960 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:41:15,960 \tHypothesis: Die kọyen egbomọphẹ ?\n",
            "2019-12-31 08:41:15,960 Example #3\n",
            "2019-12-31 08:41:15,961 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:41:15,961 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:41:15,961 \tHypothesis: Wọ je guọnọ nẹ ukoko wẹn ro vwo oka rẹ ovwan na ọfa - a .\n",
            "2019-12-31 08:41:15,961 Validation result (greedy) at epoch 134, step    36000: bleu:  14.50, loss: 64826.3672, ppl:  22.1938, duration: 33.5113s\n",
            "2019-12-31 08:41:30,509 Epoch 134 Step:    36100 Batch Loss:     0.788069 Tokens per Sec:    14193, Lr: 0.000072\n",
            "2019-12-31 08:41:44,908 Epoch 134 Step:    36200 Batch Loss:     0.550192 Tokens per Sec:    13956, Lr: 0.000072\n",
            "2019-12-31 08:41:47,342 Epoch 134: total training loss 167.40\n",
            "2019-12-31 08:41:47,342 EPOCH 135\n",
            "2019-12-31 08:41:59,600 Epoch 135 Step:    36300 Batch Loss:     0.454963 Tokens per Sec:    13954, Lr: 0.000072\n",
            "2019-12-31 08:42:14,033 Epoch 135 Step:    36400 Batch Loss:     0.555912 Tokens per Sec:    13706, Lr: 0.000072\n",
            "2019-12-31 08:42:26,802 Epoch 135: total training loss 166.62\n",
            "2019-12-31 08:42:26,803 EPOCH 136\n",
            "2019-12-31 08:42:28,590 Epoch 136 Step:    36500 Batch Loss:     0.545697 Tokens per Sec:    13688, Lr: 0.000072\n",
            "2019-12-31 08:42:43,132 Epoch 136 Step:    36600 Batch Loss:     0.414271 Tokens per Sec:    13848, Lr: 0.000072\n",
            "2019-12-31 08:42:57,747 Epoch 136 Step:    36700 Batch Loss:     0.489893 Tokens per Sec:    14059, Lr: 0.000072\n",
            "2019-12-31 08:43:06,055 Epoch 136: total training loss 164.71\n",
            "2019-12-31 08:43:06,055 EPOCH 137\n",
            "2019-12-31 08:43:12,283 Epoch 137 Step:    36800 Batch Loss:     0.737937 Tokens per Sec:    13703, Lr: 0.000072\n",
            "2019-12-31 08:43:26,841 Epoch 137 Step:    36900 Batch Loss:     0.663248 Tokens per Sec:    14081, Lr: 0.000072\n",
            "2019-12-31 08:43:41,102 Epoch 137 Step:    37000 Batch Loss:     0.484364 Tokens per Sec:    13674, Lr: 0.000072\n",
            "2019-12-31 08:44:14,580 Example #0\n",
            "2019-12-31 08:44:14,580 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:44:14,581 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:44:14,581 \tHypothesis: Omaẹkparọ yen nẹrhẹ ihwo ni ubiudu rayen ghanghanre , ayen me je nabọ muegbe rẹ oborẹ avwanre che nene ọrhuẹrẹphiyotọ na .\n",
            "2019-12-31 08:44:14,581 Example #1\n",
            "2019-12-31 08:44:14,581 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:44:14,581 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:44:14,581 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:44:14,581 Example #2\n",
            "2019-12-31 08:44:14,581 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:44:14,581 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:44:14,581 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:44:14,581 Example #3\n",
            "2019-12-31 08:44:14,581 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:44:14,581 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:44:14,581 \tHypothesis: Ọ je dianẹ ovwan vẹ ukoko wẹn ro vwo ẹghwọ na - a .\n",
            "2019-12-31 08:44:14,581 Validation result (greedy) at epoch 137, step    37000: bleu:  14.71, loss: 65046.7500, ppl:  22.4289, duration: 33.4795s\n",
            "2019-12-31 08:44:18,653 Epoch 137: total training loss 165.17\n",
            "2019-12-31 08:44:18,653 EPOCH 138\n",
            "2019-12-31 08:44:29,148 Epoch 138 Step:    37100 Batch Loss:     0.660452 Tokens per Sec:    13988, Lr: 0.000050\n",
            "2019-12-31 08:44:43,642 Epoch 138 Step:    37200 Batch Loss:     0.658041 Tokens per Sec:    14156, Lr: 0.000050\n",
            "2019-12-31 08:44:57,747 Epoch 138: total training loss 162.98\n",
            "2019-12-31 08:44:57,747 EPOCH 139\n",
            "2019-12-31 08:44:58,080 Epoch 139 Step:    37300 Batch Loss:     0.659146 Tokens per Sec:    13309, Lr: 0.000050\n",
            "2019-12-31 08:45:12,701 Epoch 139 Step:    37400 Batch Loss:     0.622420 Tokens per Sec:    13942, Lr: 0.000050\n",
            "2019-12-31 08:45:27,176 Epoch 139 Step:    37500 Batch Loss:     0.640145 Tokens per Sec:    13825, Lr: 0.000050\n",
            "2019-12-31 08:45:37,097 Epoch 139: total training loss 161.53\n",
            "2019-12-31 08:45:37,098 EPOCH 140\n",
            "2019-12-31 08:45:41,881 Epoch 140 Step:    37600 Batch Loss:     0.494648 Tokens per Sec:    14162, Lr: 0.000050\n",
            "2019-12-31 08:45:56,479 Epoch 140 Step:    37700 Batch Loss:     0.602178 Tokens per Sec:    14065, Lr: 0.000050\n",
            "2019-12-31 08:46:11,004 Epoch 140 Step:    37800 Batch Loss:     0.734182 Tokens per Sec:    13795, Lr: 0.000050\n",
            "2019-12-31 08:46:16,137 Epoch 140: total training loss 160.72\n",
            "2019-12-31 08:46:16,137 EPOCH 141\n",
            "2019-12-31 08:46:25,679 Epoch 141 Step:    37900 Batch Loss:     0.720358 Tokens per Sec:    14316, Lr: 0.000050\n",
            "2019-12-31 08:46:40,202 Epoch 141 Step:    38000 Batch Loss:     0.777925 Tokens per Sec:    13369, Lr: 0.000050\n",
            "2019-12-31 08:47:13,748 Example #0\n",
            "2019-12-31 08:47:13,748 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:47:13,748 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:47:13,749 \tHypothesis: Ihoho nana pha ghanghanre vwẹ idjerhe tiọna , ayen muegbe rẹ ayen vwo nene odjekẹ rẹ avwanre .\n",
            "2019-12-31 08:47:13,749 Example #1\n",
            "2019-12-31 08:47:13,749 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:47:13,749 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:47:13,749 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:47:13,749 Example #2\n",
            "2019-12-31 08:47:13,749 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:47:13,749 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:47:13,749 \tHypothesis: Kẹ egbomọphẹ vọ yen o vwo ruo ?\n",
            "2019-12-31 08:47:13,749 Example #3\n",
            "2019-12-31 08:47:13,749 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:47:13,749 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:47:13,749 \tHypothesis: Ọ je hepha kẹ ukoko wẹn vẹ ukoko wẹn jovwo re .\n",
            "2019-12-31 08:47:13,750 Validation result (greedy) at epoch 141, step    38000: bleu:  14.58, loss: 65116.2500, ppl:  22.5036, duration: 33.5468s\n",
            "2019-12-31 08:47:28,335 Epoch 141 Step:    38100 Batch Loss:     0.362765 Tokens per Sec:    14178, Lr: 0.000050\n",
            "2019-12-31 08:47:28,920 Epoch 141: total training loss 160.48\n",
            "2019-12-31 08:47:28,920 EPOCH 142\n",
            "2019-12-31 08:47:42,803 Epoch 142 Step:    38200 Batch Loss:     0.508243 Tokens per Sec:    13863, Lr: 0.000050\n",
            "2019-12-31 08:47:57,412 Epoch 142 Step:    38300 Batch Loss:     0.531780 Tokens per Sec:    13963, Lr: 0.000050\n",
            "2019-12-31 08:48:08,312 Epoch 142: total training loss 161.02\n",
            "2019-12-31 08:48:08,312 EPOCH 143\n",
            "2019-12-31 08:48:11,844 Epoch 143 Step:    38400 Batch Loss:     0.644076 Tokens per Sec:    13629, Lr: 0.000050\n",
            "2019-12-31 08:48:26,408 Epoch 143 Step:    38500 Batch Loss:     0.673954 Tokens per Sec:    14067, Lr: 0.000050\n",
            "2019-12-31 08:48:40,746 Epoch 143 Step:    38600 Batch Loss:     0.714925 Tokens per Sec:    13698, Lr: 0.000050\n",
            "2019-12-31 08:48:47,597 Epoch 143: total training loss 161.21\n",
            "2019-12-31 08:48:47,597 EPOCH 144\n",
            "2019-12-31 08:48:55,251 Epoch 144 Step:    38700 Batch Loss:     0.735921 Tokens per Sec:    13769, Lr: 0.000050\n",
            "2019-12-31 08:49:09,757 Epoch 144 Step:    38800 Batch Loss:     0.715990 Tokens per Sec:    13943, Lr: 0.000050\n",
            "2019-12-31 08:49:24,254 Epoch 144 Step:    38900 Batch Loss:     0.320705 Tokens per Sec:    14152, Lr: 0.000050\n",
            "2019-12-31 08:49:26,711 Epoch 144: total training loss 160.16\n",
            "2019-12-31 08:49:26,711 EPOCH 145\n",
            "2019-12-31 08:49:38,919 Epoch 145 Step:    39000 Batch Loss:     0.636216 Tokens per Sec:    14106, Lr: 0.000050\n",
            "2019-12-31 08:50:12,367 Example #0\n",
            "2019-12-31 08:50:12,367 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:50:12,367 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:50:12,367 \tHypothesis: Ihoho nana pha ghanghanre vwẹ idjerhe rẹ ẹwẹn avwanre se vwo muegbe rẹ ayen vwo nene odjekẹ na .\n",
            "2019-12-31 08:50:12,367 Example #1\n",
            "2019-12-31 08:50:12,367 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:50:12,367 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:50:12,367 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:50:12,367 Example #2\n",
            "2019-12-31 08:50:12,367 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:50:12,368 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:50:12,368 \tHypothesis: Die kọyen egbomọphẹ ?\n",
            "2019-12-31 08:50:12,368 Example #3\n",
            "2019-12-31 08:50:12,368 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:50:12,368 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:50:12,368 \tHypothesis: Ọ je dianẹ ovwan je hepha vwẹ ukoko wẹn ro vwo oba na ọfa - a .\n",
            "2019-12-31 08:50:12,368 Validation result (greedy) at epoch 145, step    39000: bleu:  14.54, loss: 65140.5664, ppl:  22.5297, duration: 33.4482s\n",
            "2019-12-31 08:50:26,878 Epoch 145 Step:    39100 Batch Loss:     0.511276 Tokens per Sec:    13772, Lr: 0.000050\n",
            "2019-12-31 08:50:39,297 Epoch 145: total training loss 158.48\n",
            "2019-12-31 08:50:39,298 EPOCH 146\n",
            "2019-12-31 08:50:41,497 Epoch 146 Step:    39200 Batch Loss:     0.620622 Tokens per Sec:    13332, Lr: 0.000050\n",
            "2019-12-31 08:50:56,134 Epoch 146 Step:    39300 Batch Loss:     0.690100 Tokens per Sec:    13946, Lr: 0.000050\n",
            "2019-12-31 08:51:10,724 Epoch 146 Step:    39400 Batch Loss:     0.558163 Tokens per Sec:    13737, Lr: 0.000050\n",
            "2019-12-31 08:51:18,756 Epoch 146: total training loss 159.18\n",
            "2019-12-31 08:51:18,756 EPOCH 147\n",
            "2019-12-31 08:51:25,382 Epoch 147 Step:    39500 Batch Loss:     0.600398 Tokens per Sec:    14074, Lr: 0.000050\n",
            "2019-12-31 08:51:39,805 Epoch 147 Step:    39600 Batch Loss:     0.676031 Tokens per Sec:    13651, Lr: 0.000050\n",
            "2019-12-31 08:51:54,327 Epoch 147 Step:    39700 Batch Loss:     0.660990 Tokens per Sec:    14191, Lr: 0.000050\n",
            "2019-12-31 08:51:58,040 Epoch 147: total training loss 159.34\n",
            "2019-12-31 08:51:58,040 EPOCH 148\n",
            "2019-12-31 08:52:08,862 Epoch 148 Step:    39800 Batch Loss:     0.424601 Tokens per Sec:    13635, Lr: 0.000050\n",
            "2019-12-31 08:52:23,395 Epoch 148 Step:    39900 Batch Loss:     0.631629 Tokens per Sec:    13917, Lr: 0.000050\n",
            "2019-12-31 08:52:37,440 Epoch 148: total training loss 158.73\n",
            "2019-12-31 08:52:37,440 EPOCH 149\n",
            "2019-12-31 08:52:38,070 Epoch 149 Step:    40000 Batch Loss:     0.667331 Tokens per Sec:    13908, Lr: 0.000050\n",
            "2019-12-31 08:53:11,522 Example #0\n",
            "2019-12-31 08:53:11,523 \tSource:     These orchestral arrangements are composed in such a way that they will prepare our heart and mind for the program to follow .\n",
            "2019-12-31 08:53:11,523 \tReference:  E ru uhworo nana vwẹ idjerhe ro de se muegbe rẹ ubiudu rẹ avwanre hẹrhẹ ọrhuẹrẹphiyọ rẹ ẹdẹ yena .\n",
            "2019-12-31 08:53:11,523 \tHypothesis: Ihoho nana nẹrhẹ ihwo roro nẹ ayen che muegbe rẹ ubiudu avwanre , je reyọ oborẹ avwanre yonori .\n",
            "2019-12-31 08:53:11,523 Example #1\n",
            "2019-12-31 08:53:11,523 \tSource:     Today he is serving at Bethel .\n",
            "2019-12-31 08:53:11,523 \tReference:  Nonẹna , ọ ga vwẹ Bẹtẹl .\n",
            "2019-12-31 08:53:11,523 \tHypothesis: Asaọkiephana , ọ ga vwẹ Bẹtẹl asaọkiephana .\n",
            "2019-12-31 08:53:11,523 Example #2\n",
            "2019-12-31 08:53:11,523 \tSource:     But freedom from what ?\n",
            "2019-12-31 08:53:11,523 \tReference:  Ẹkẹvuọvo , ẹdia vọ yen egbomọphẹ na che si ayen nu ?\n",
            "2019-12-31 08:53:11,524 \tHypothesis: Die kọyen egbomọphẹ ?\n",
            "2019-12-31 08:53:11,524 Example #3\n",
            "2019-12-31 08:53:11,524 \tSource:     Avoid comparing your new congregation with your previous one .\n",
            "2019-12-31 08:53:11,524 \tReference:  Wọ vwẹ ukoko kpokpọ na vwọ vwanvwen ọ rẹ wo nurhe na - a .\n",
            "2019-12-31 08:53:11,524 \tHypothesis: Ọ je dianẹ ovwan vẹ ukoko kpokpọ na ọfa vwo oka rẹ ovwan ro chekọ - a .\n",
            "2019-12-31 08:53:11,524 Validation result (greedy) at epoch 149, step    40000: bleu:  15.22, loss: 65266.7891, ppl:  22.6661, duration: 33.4538s\n",
            "2019-12-31 08:53:25,998 Epoch 149 Step:    40100 Batch Loss:     0.552748 Tokens per Sec:    14109, Lr: 0.000050\n",
            "2019-12-31 08:53:40,499 Epoch 149 Step:    40200 Batch Loss:     0.584819 Tokens per Sec:    13803, Lr: 0.000050\n",
            "2019-12-31 08:53:49,957 Epoch 149: total training loss 157.60\n",
            "2019-12-31 08:53:49,958 EPOCH 150\n",
            "2019-12-31 08:53:54,998 Epoch 150 Step:    40300 Batch Loss:     0.602546 Tokens per Sec:    15056, Lr: 0.000050\n",
            "2019-12-31 08:54:09,371 Epoch 150 Step:    40400 Batch Loss:     0.319972 Tokens per Sec:    13679, Lr: 0.000050\n",
            "2019-12-31 08:54:23,947 Epoch 150 Step:    40500 Batch Loss:     0.531706 Tokens per Sec:    13793, Lr: 0.000050\n",
            "2019-12-31 08:54:29,146 Epoch 150: total training loss 157.35\n",
            "2019-12-31 08:54:29,146 Training ended after 150 epochs.\n",
            "2019-12-31 08:54:29,146 Best validation result (greedy) at step     7000:  13.40 ppl.\n",
            "2019-12-31 08:54:48,104  dev bleu:  11.80 [Beam search decoding with beam size = 5 and alpha = 1.0]\n",
            "2019-12-31 08:54:48,104 Translations saved to: models/enurh_transformer/00007000.hyps.dev\n",
            "2019-12-31 08:55:16,292 test bleu:  22.39 [Beam search decoding with beam size = 5 and alpha = 1.0]\n",
            "2019-12-31 08:55:16,293 Translations saved to: models/enurh_transformer/00007000.hyps.test\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "MBoDS09JM807",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        },
        "outputId": "ace2c1d7-6b25-4f09-9fef-c5b150787a61"
      },
      "source": [
        "# Copy the created models from the notebook storage to google drive for persistant storage \n",
        "#!cp -r joeynmt/models/${src}${tgt}_transformer/* \"$gdrive_path/models/${src}${tgt}_transformer/\"\n",
        "!cp -r joeynmt/models/${src}${tgt}_transformer/* drive/'My Drive'/masakhane/en-urh-baseline/models/enurh_transformer"
      ],
      "execution_count": 45,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "cp: cannot create symbolic link 'drive/My Drive/masakhane/en-urh-baseline/models/enurh_transformer/best.ckpt': Operation not supported\n"
          ],
          "name": "stdout"
        }
      ]
    },
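    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# The copy above failed on 'best.ckpt' because it is a symbolic link and the\n",
        "# mounted Google Drive filesystem cannot create symlinks. A minimal sketch of a\n",
        "# workaround (assuming GNU cp, as on Colab): dereference links with -L so the\n",
        "# checkpoint file the link points to is copied instead of the link itself.\n",
        "#!cp -rL joeynmt/models/${src}${tgt}_transformer/* drive/'My Drive'/masakhane/en-urh-baseline/models/enurh_transformer"
      ],
      "execution_count": null,
      "outputs": []
    },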
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "n94wlrCjVc17",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 697
        },
        "outputId": "0208b579-1ac9-4bdd-ad80-8ded7258c35d"
      },
      "source": [
        "# Output our validation accuracy\n",
        "! cat \"$gdrive_path/models/${src}${tgt}_transformer/validations.txt\""
      ],
      "execution_count": 46,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Steps: 1000\tLoss: 78227.96875\tPPL: 42.12487\tbleu: 0.43998\tLR: 0.00030000\t*\n",
            "Steps: 2000\tLoss: 67111.74219\tPPL: 24.75660\tbleu: 2.83394\tLR: 0.00030000\t*\n",
            "Steps: 3000\tLoss: 61367.63281\tPPL: 18.81069\tbleu: 4.77393\tLR: 0.00030000\t*\n",
            "Steps: 4000\tLoss: 57640.03906\tPPL: 15.73964\tbleu: 6.48525\tLR: 0.00030000\t*\n",
            "Steps: 5000\tLoss: 55712.81250\tPPL: 14.35399\tbleu: 8.95064\tLR: 0.00030000\t*\n",
            "Steps: 6000\tLoss: 54999.32422\tPPL: 13.87253\tbleu: 10.15556\tLR: 0.00030000\t*\n",
            "Steps: 7000\tLoss: 54282.33984\tPPL: 13.40498\tbleu: 11.26286\tLR: 0.00030000\t*\n",
            "Steps: 8000\tLoss: 54547.91406\tPPL: 13.57630\tbleu: 12.18470\tLR: 0.00030000\t\n",
            "Steps: 9000\tLoss: 54701.26953\tPPL: 13.67622\tbleu: 12.61264\tLR: 0.00030000\t\n",
            "Steps: 10000\tLoss: 55179.74609\tPPL: 13.99273\tbleu: 12.95071\tLR: 0.00030000\t\n",
            "Steps: 11000\tLoss: 55769.63672\tPPL: 14.39304\tbleu: 13.16508\tLR: 0.00030000\t\n",
            "Steps: 12000\tLoss: 56564.67188\tPPL: 14.95075\tbleu: 13.49432\tLR: 0.00030000\t\n",
            "Steps: 13000\tLoss: 56796.05859\tPPL: 15.11708\tbleu: 13.26063\tLR: 0.00021000\t\n",
            "Steps: 14000\tLoss: 57613.84375\tPPL: 15.71993\tbleu: 13.94025\tLR: 0.00021000\t\n",
            "Steps: 15000\tLoss: 58165.76562\tPPL: 16.14033\tbleu: 13.85035\tLR: 0.00021000\t\n",
            "Steps: 16000\tLoss: 58579.19922\tPPL: 16.46258\tbleu: 14.42836\tLR: 0.00021000\t\n",
            "Steps: 17000\tLoss: 59104.80469\tPPL: 16.88158\tbleu: 14.35280\tLR: 0.00021000\t\n",
            "Steps: 18000\tLoss: 59971.57812\tPPL: 17.59597\tbleu: 13.90183\tLR: 0.00021000\t\n",
            "Steps: 19000\tLoss: 60328.69141\tPPL: 17.89902\tbleu: 14.09911\tLR: 0.00014700\t\n",
            "Steps: 20000\tLoss: 60525.47266\tPPL: 18.06823\tbleu: 14.55474\tLR: 0.00014700\t\n",
            "Steps: 21000\tLoss: 61023.71094\tPPL: 18.50387\tbleu: 14.48248\tLR: 0.00014700\t\n",
            "Steps: 22000\tLoss: 61665.95703\tPPL: 19.08094\tbleu: 14.40739\tLR: 0.00014700\t\n",
            "Steps: 23000\tLoss: 61697.98047\tPPL: 19.11018\tbleu: 14.12732\tLR: 0.00014700\t\n",
            "Steps: 24000\tLoss: 62261.51953\tPPL: 19.63214\tbleu: 14.51305\tLR: 0.00014700\t\n",
            "Steps: 25000\tLoss: 62652.58594\tPPL: 20.00271\tbleu: 14.48856\tLR: 0.00010290\t\n",
            "Steps: 26000\tLoss: 62752.07422\tPPL: 20.09810\tbleu: 14.72953\tLR: 0.00010290\t\n",
            "Steps: 27000\tLoss: 62960.05859\tPPL: 20.29897\tbleu: 15.06399\tLR: 0.00010290\t\n",
            "Steps: 28000\tLoss: 63579.70703\tPPL: 20.90943\tbleu: 15.01341\tLR: 0.00010290\t\n",
            "Steps: 29000\tLoss: 63843.86719\tPPL: 21.17521\tbleu: 14.88903\tLR: 0.00010290\t\n",
            "Steps: 30000\tLoss: 63885.11719\tPPL: 21.21703\tbleu: 14.17094\tLR: 0.00010290\t\n",
            "Steps: 31000\tLoss: 64090.25391\tPPL: 21.42617\tbleu: 14.80125\tLR: 0.00007203\t\n",
            "Steps: 32000\tLoss: 64068.02734\tPPL: 21.40341\tbleu: 15.04242\tLR: 0.00007203\t\n",
            "Steps: 33000\tLoss: 64300.80859\tPPL: 21.64298\tbleu: 15.01238\tLR: 0.00007203\t\n",
            "Steps: 34000\tLoss: 64409.68359\tPPL: 21.75595\tbleu: 14.89129\tLR: 0.00007203\t\n",
            "Steps: 35000\tLoss: 64785.31250\tPPL: 22.15025\tbleu: 14.91799\tLR: 0.00007203\t\n",
            "Steps: 36000\tLoss: 64826.36719\tPPL: 22.19377\tbleu: 14.49856\tLR: 0.00007203\t\n",
            "Steps: 37000\tLoss: 65046.75000\tPPL: 22.42889\tbleu: 14.71071\tLR: 0.00005042\t\n",
            "Steps: 38000\tLoss: 65116.25000\tPPL: 22.50355\tbleu: 14.57850\tLR: 0.00005042\t\n",
            "Steps: 39000\tLoss: 65140.56641\tPPL: 22.52973\tbleu: 14.54128\tLR: 0.00005042\t\n",
            "Steps: 40000\tLoss: 65266.78906\tPPL: 22.66612\tbleu: 15.21854\tLR: 0.00005042\t\n"
          ],
          "name": "stdout"
        }
      ]
    },
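    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Minimal sketch: parse validations.txt (tab-separated \"Key: value\" fields, one\n",
        "# row per validation run; a trailing '*' marks steps saved as the new best\n",
        "# checkpoint) and report the step with the highest dev BLEU. Assumes gdrive_path,\n",
        "# src and tgt were exported to os.environ earlier in the notebook, as the shell\n",
        "# cells above suggest.\n",
        "import os\n",
        "\n",
        "val_file = os.path.join(os.environ[\"gdrive_path\"], \"models\",\n",
        "                        os.environ[\"src\"] + os.environ[\"tgt\"] + \"_transformer\",\n",
        "                        \"validations.txt\")\n",
        "best_step, best_bleu = None, -1.0\n",
        "with open(val_file) as f:\n",
        "    for line in f:\n",
        "        fields = dict(p.split(\": \") for p in line.strip().split(\"\\t\") if \": \" in p)\n",
        "        if float(fields[\"bleu\"]) > best_bleu:\n",
        "            best_step, best_bleu = int(fields[\"Steps\"]), float(fields[\"bleu\"])\n",
        "print(\"Best dev BLEU {:.2f} at step {}\".format(best_bleu, best_step))"
      ],
      "execution_count": null,
      "outputs": []
    },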
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "66WhRE9lIhoD",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 68
        },
        "outputId": "fe87bc05-5f40-41f8-95f9-8bd7d982a49d"
      },
      "source": [
        "# Test our model\n",
        "! cd joeynmt; python3 -m joeynmt test \"$gdrive_path/models/${src}${tgt}_transformer/config.yaml\""
      ],
      "execution_count": 47,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "2019-12-31 08:55:58,017 Hello! This is Joey-NMT.\n",
            "2019-12-31 08:56:19,658  dev bleu:  11.80 [Beam search decoding with beam size = 5 and alpha = 1.0]\n",
            "2019-12-31 08:56:47,548 test bleu:  22.39 [Beam search decoding with beam size = 5 and alpha = 1.0]\n"
          ],
          "name": "stdout"
        }
      ]
    },
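    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Minimal sketch (added, not part of the original run): translate a new English\n",
        "# sentence with the trained model via JoeyNMT's translate mode, which reads\n",
        "# source sentences from stdin. Note that input should be preprocessed (tokenized\n",
        "# and BPE-encoded) the same way as the training data to get sensible output.\n",
        "! cd joeynmt; echo \"Today he is serving at Bethel .\" | python3 -m joeynmt translate \"$gdrive_path/models/${src}${tgt}_transformer/config.yaml\""
      ],
      "execution_count": null,
      "outputs": []
    }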
  ]
}