This is a fine-tuned version of keshan/sinhala-gpt2, trained on Sinhala newswire articles. It was fine-tuned on ~12 MB of data:

- Num examples: 8,395
- Batch size: 8

The model reached a perplexity of 3.15 on evaluation.
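For context, perplexity is the exponential of the average cross-entropy loss in nats per token, so the reported 3.15 corresponds to a loss of roughly 1.15. A minimal sketch of that relationship (the `perplexity` helper and the loss value here are illustrative, not taken from the training script):

```python
import math

def perplexity(avg_nll: float) -> float:
    """Convert an average negative log-likelihood (nats/token) to perplexity."""
    return math.exp(avg_nll)

# The reported perplexity of 3.15 implies an average loss of ln(3.15) ~= 1.147.
loss = math.log(3.15)
print(round(perplexity(loss), 2))  # → 3.15
```

This is the same conversion libraries such as Hugging Face `transformers` use when reporting `eval_loss`: exponentiating the evaluation loss recovers the perplexity quoted above.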