Epoch: 0 Training loss: 0.31682723969221116 - MAE: 0.4270551883918429 | Validation loss: 0.17406331966905034 - MAE: 0.32230179369633777
Epoch: 1 Training loss: 0.16770198747515677 - MAE: 0.31259940333736164 | Validation loss: 0.1586608391474275 - MAE: 0.30445839043374284
Epoch: 2 Training loss: 0.159769741743803 - MAE: 0.3024783775537543 | Validation loss: 0.1531248894684455 - MAE: 0.2979603761303602
Epoch: 3 Training loss: 0.15299145340919496 - MAE: 0.2954564989999413 | Validation loss: 0.14940124268040939 - MAE: 0.2937186717776503
Epoch: 4 Training loss: 0.14963373884558678 - MAE: 0.2927651877520682 | Validation loss: 0.1465359816656393 - MAE: 0.28962546403058864
Epoch: 5 Training loss: 0.14846575662493705 - MAE: 0.2924354030478544 | Validation loss: 0.14600852613939957 - MAE: 0.2901171886959917
Epoch: 6 Training loss: 0.1461155606806278 - MAE: 0.28910296909082894 | Validation loss: 0.1443572162705309 - MAE: 0.2882029863669967
Epoch: 7 Training loss: 0.1435057543218136 - MAE: 0.28626437472780547 | Validation loss: 0.14302697953055887 - MAE: 0.28601234103062984
Epoch: 8 Training loss: 0.14254434838891028 - MAE: 0.28481520755382117 | Validation loss: 0.1440344860448557 - MAE: 0.28876987237318597
Epoch: 9 Training loss: 0.14220659375190736 - MAE: 0.2843890291073015 | Validation loss: 0.14513540706213782 - MAE: 0.29083508437670147
Epoch: 10 Training loss: 0.14016222164034844 - MAE: 0.284439432952026 | Validation loss: 0.14425752794041352 - MAE: 0.2893868117236209
Epoch: 11 Training loss: 0.13950059577822685 - MAE: 0.28205096776739585 | Validation loss: 0.14302689625936396 - MAE: 0.2873420180144296
Epoch: 12 Training loss: 0.14032857306301594 - MAE: 0.2838538819069717 | Validation loss: 0.14159568223883123 - MAE: 0.2846705210276279
Epoch: 13 Training loss: 0.13777491554617882 - MAE: 0.2811532460436431 | Validation loss: 0.1402983827626004 - MAE: 0.28214487482668094
Epoch: 14 Training loss: 0.1384480883181095 - MAE: 0.2826869318220167 | Validation loss: 0.14003691515501807 - MAE: 0.2823408793685602
Epoch: 15 Training loss: 0.1364074445515871 - MAE: 0.27945450259770427 | Validation loss: 0.14107048862120686 - MAE: 0.2845604702301118
Epoch: 16 Training loss: 0.13790464267134667 - MAE: 0.2811272648348582 | Validation loss: 0.13953395713778102 - MAE: 0.2813538407893107
Epoch: 17 Training loss: 0.13609110862016677 - MAE: 0.27886227584191275 | Validation loss: 0.1393512728459695 - MAE: 0.28146348314429903
Epoch: 18 Training loss: 0.1379956752061844 - MAE: 0.2805913614355465 | Validation loss: 0.14084946802433798 - MAE: 0.2841884875193426
Epoch: 19 Training loss: 0.13686978235840797 - MAE: 0.27852323035389914 | Validation loss: 0.1389886044404086 - MAE: 0.2800740030523367
Epoch: 20 Training loss: 0.13681577205657958 - MAE: 0.27942651378620287 | Validation loss: 0.13925977927796981 - MAE: 0.28123826139713554
Epoch: 21 Training loss: 0.13500447817146777 - MAE: 0.2771479085296469 | Validation loss: 0.13857040510458105 - MAE: 0.2804170190528482
Epoch: 22 Training loss: 0.13786660879850388 - MAE: 0.2803796913909176 | Validation loss: 0.13938766323468266 - MAE: 0.28218952799424674
Epoch: 23 Training loss: 0.1356737055629492 - MAE: 0.27760473376719097 | Validation loss: 0.1387395648395314 - MAE: 0.28104305604965496
Epoch: 24 Training loss: 0.1354680922627449 - MAE: 0.2783789680019564 | Validation loss: 0.13770897116731196 - MAE: 0.2785631474709757
Epoch: 25 Training loss: 0.13535136304795742 - MAE: 0.2780364614592138 | Validation loss: 0.1387501472935957 - MAE: 0.2815106612637648
Epoch: 26 Training loss: 0.13641170173883438 - MAE: 0.2785825396623511 | Validation loss: 0.1377785275964176 - MAE: 0.2779127802398686
Epoch: 27 Training loss: 0.13511715725064277 - MAE: 0.2772811605749396 | Validation loss: 0.13769023996942184 - MAE: 0.2791481897868475
Epoch: 28 Training loss: 0.13464232258498668 - MAE: 0.2768529460597231 | Validation loss: 0.13858693136888392 - MAE: 0.28120509094330265
Epoch: 29 Training loss: 0.13505111686885357 - MAE: 0.27770146252380484 | Validation loss: 0.13782754584270365 - MAE: 0.27865634634092107
Epoch: 30 Training loss: 0.1361152843385935 - MAE: 0.2788007818749053 | Validation loss: 0.13823135328643463 - MAE: 0.2778680444780777
Epoch: 31 Training loss: 0.13569585122168065 - MAE: 0.27721791551653874 | Validation loss: 0.1382857184199726 - MAE: 0.27900113514637637
Epoch: 32 Training loss: 0.13462988868355752 - MAE: 0.27621357082625864 | Validation loss: 0.13866162782206254 - MAE: 0.280906721333951
Epoch: 33 Training loss: 0.1322924195975065 - MAE: 0.2759485545392573 | Validation loss: 0.13808968619388692 - MAE: 0.27962836651910467
Epoch: 34 Training loss: 0.1350867758691311 - MAE: 0.277744652500543 | Validation loss: 0.13849331613849192 - MAE: 0.2774987002315543
Epoch: 35 Training loss: 0.13530310183763505 - MAE: 0.27795139872194674 | Validation loss: 0.13775512619930155 - MAE: 0.2783110165214615
Epoch: 36 Training loss: 0.1330907303839922 - MAE: 0.27542688121103176 | Validation loss: 0.13827609752907472 - MAE: 0.278545462247842
Epoch: 37 Training loss: 0.1348443178832531 - MAE: 0.2772734767600124 | Validation loss: 0.13749399649746277 - MAE: 0.27805317930598505
Epoch: 38 Training loss: 0.13353365540504455 - MAE: 0.27670099473229837 | Validation loss: 0.13741220139405308 - MAE: 0.2781038571154334
Epoch: 39 Training loss: 0.13391907557845115 - MAE: 0.2772429205417003 | Validation loss: 0.1373393614502514 - MAE: 0.27825494666120654
Epoch: 40 Training loss: 0.1353518583625555 - MAE: 0.27881707202145656