|
gpt2-turkish-wiki |
|
|
|
The current version is a demo only, trained on a small sample of Turkish Wikipedia text.
|
|
|
Based on a modified version of https://github.com/piegu/fastai-projects/blob/master/finetuning-English-GPT2-any-language-Portuguese-HuggingFace-fastaiv2_FAST.ipynb, which fine-tunes the English GPT-2 model for another language.
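A first step in that notebook's approach is retraining GPT-2's byte-level BPE tokenizer on text in the target language. Below is a minimal sketch using the Hugging Face `tokenizers` library; the corpus file name and the tiny inline corpus are stand-ins for the real Wikipedia text, and the small `vocab_size` is for demonstration only (the notebook keeps GPT-2's full vocabulary size in practice).

```python
# Sketch: train a byte-level BPE tokenizer on Turkish text so the
# vocabulary matches the new language before fine-tuning GPT-2.
from tokenizers import ByteLevelBPETokenizer

corpus_file = "tr_corpus.txt"  # hypothetical path; use the real dump text here
with open(corpus_file, "w", encoding="utf-8") as f:
    # Tiny stand-in corpus; the real run uses the Wikipedia dump.
    f.write("Türkiye, Güneydoğu Avrupa ile Batı Asya'da yer alan bir ülkedir.\n" * 200)

tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=[corpus_file],
    vocab_size=1000,                   # demo size; GPT-2 uses 50257
    special_tokens=["<|endoftext|>"],  # GPT-2's single special token
)

enc = tokenizer.encode("Merhaba dünya")
print(enc.tokens)
```

The trained tokenizer's vocabulary can then replace the English one, with the model's token embeddings re-initialized for new tokens as the notebook describes.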
|
|
|
Inference quality is still limited at this stage.
|
|
|
| epoch | train_loss | valid_loss | accuracy | perplexity | time  |
|------:|-----------:|-----------:|---------:|-----------:|------:|
| 0     | 4.373726   | 5.398773   | 0.264228 | 221.134857 | 02:56 |
| 1     | 4.264910   | 5.344171   | 0.267870 | 209.384140 | 02:54 |
|
|
|
|
|
TODO: the full Turkish Wikipedia dump is a 3 GB XML file.
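A dump that large should be stream-parsed rather than loaded into memory at once. The sketch below is a hypothetical illustration using only the Python standard library, with a tiny inline XML sample standing in for the real 3 GB file; the MediaWiki export namespace shown is an assumption about the dump format.

```python
# Sketch: stream-parse a Wikipedia XML dump, extracting each page's text
# incrementally so memory stays bounded even for a multi-GB file.
import io
import xml.etree.ElementTree as ET

# Tiny inline sample standing in for the real dump file.
sample = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page><title>Türkiye</title>
    <revision><text>Türkiye bir ülkedir.</text></revision>
  </page>
</mediawiki>"""

NS = "{http://www.mediawiki.org/xml/export-0.10/}"
texts = []
for _, elem in ET.iterparse(io.StringIO(sample)):  # fires on element end
    if elem.tag == NS + "text":
        texts.append(elem.text)
        elem.clear()  # free the element's memory as we stream
print(texts)  # → ['Türkiye bir ülkedir.']
```

For the real dump, `io.StringIO(sample)` would be replaced by the open dump file.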
|
|
|
One epoch of training on the full Turkish Wikipedia gave noticeably better results; this repository will be updated once the full model is ready.
|
| epoch | train_loss | valid_loss | accuracy | perplexity | time    |
|------:|-----------:|-----------:|---------:|-----------:|--------:|
| 0     | 3.948997   | 4.001249   | 0.330571 | 54.666405  | 2:41:54 |
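The perplexity columns in the tables above follow from the validation loss: fastai's Perplexity metric is the exponential of the cross-entropy loss (in nats), which can be checked directly against the reported numbers:

```python
import math

# perplexity = exp(valid_loss); verify each (valid_loss, perplexity)
# pair reported in the tables above.
rows = [
    (5.398773, 221.134857),  # demo run, epoch 0
    (5.344171, 209.384140),  # demo run, epoch 1
    (4.001249, 54.666405),   # full-Wikipedia run, epoch 0
]
for valid_loss, perplexity in rows:
    assert abs(math.exp(valid_loss) - perplexity) < 1e-3
print("all perplexity values match exp(valid_loss)")
```

This is why the drop in validation loss from 5.34 to 4.00 corresponds to such a large drop in perplexity, from about 209 to about 55.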
|
|
|
|