# gpt2-small-turkish

## gpt2-turkish-wiki

The current version is a demo only, trained on 250 MB of Turkish Wikipedia text.

Training uses a modified version of this notebook (fine-tuning English GPT-2 to another language with Hugging Face and fastai v2): https://github.com/piegu/fastai-projects/blob/master/finetuning-English-GPT2-any-language-Portuguese-HuggingFace-fastaiv2_FAST.ipynb

Inference quality is limited at the moment.
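You can still try generation with the Hugging Face `transformers` library. A minimal sketch, assuming the model is published on the hub under the id `gorkemgoknar/gpt2-small-turkish` (the prompt text is just an illustration):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed hub id; adjust if the model lives under a different name.
model_id = "gorkemgoknar/gpt2-small-turkish"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Sample a short continuation of a Turkish prompt.
prompt = "Türkiye'nin başkenti"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling parameters (`top_k`, `top_p`) are ordinary defaults, not values tuned for this model.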

| Epoch | train_loss | valid_loss | accuracy | perplexity | time  |
|------:|-----------:|-----------:|---------:|-----------:|------:|
| 0     | 4.373726   | 5.398773   | 0.264228 | 221.134857 | 02:56 |
| 1     | 4.264910   | 5.344171   | 0.267870 | 209.384140 | 02:54 |
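The perplexity column is simply the exponential of the validation loss (cross-entropy in nats), which you can verify from the table:

```python
import math

# valid_loss values from the training table above
valid_losses = [5.398773, 5.344171]
perplexities = [math.exp(vl) for vl in valid_losses]
print(perplexities)  # matches the reported 221.134857 and 209.384140
```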

TODO: train on the full Turkish Wikipedia dump, which is currently about 3 GB of text unpacked.