A character-level, GPT-2-like language model trained on English articles from Wikipedia (https://huggingface.co/datasets/wikipedia#20220301en). We used Andrej Karpathy's minGPT implementation (https://github.com/karpathy/minGPT/blob/master/projects/chargpt/chargpt.py) and trained the model for 100k epochs.
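
In a character-level setup like chargpt, the vocabulary is simply the set of distinct characters in the corpus, and encoding/decoding maps each character to an integer index. The following is a minimal illustrative sketch of that tokenization scheme; the sample text and variable names are our own, not taken from the actual training code or data.

```python
# Illustrative character-level tokenizer, in the style of chargpt.
# The sample text stands in for the Wikipedia corpus.
text = "hello wikipedia"

# Vocabulary = sorted unique characters of the corpus.
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}  # char -> integer id
itos = {i: ch for ch, i in stoi.items()}      # integer id -> char

def encode(s: str) -> list[int]:
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids: list[int]) -> str:
    """Map a list of integer token ids back to a string."""
    return "".join(itos[i] for i in ids)

ids = encode("wiki")
print(ids)
print(decode(ids))  # round-trips back to "wiki"
```

The model is then trained to predict the next character id given the preceding ones, so generation proceeds one character at a time.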