gpt2-turkish-wiki

The current version is a demo only, trained on 250 MB of Turkish Wikipedia text.

Based on a modified version of https://github.com/piegu/fastai-projects/blob/master/finetuning-English-GPT2-any-language-Portuguese-HuggingFace-fastaiv2_FAST.ipynb
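
For reference, here is a minimal sketch of the same idea using the plain HuggingFace `Trainer` API rather than the notebook's fastai v2 setup. The file path `tr_wiki.txt` and the hyperparameters are placeholders, and the English BPE tokenizer is reused as-is, whereas the linked notebook additionally adapts the tokenizer and embeddings to the target language.

```python
# Minimal sketch: fine-tune English GPT-2 on Turkish text with the HuggingFace
# Trainer API. Paths and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                          GPT2TokenizerFast, Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Plain-text Turkish Wikipedia dump, one paragraph per line (placeholder path).
dataset = load_dataset("text", data_files={"train": "tr_wiki.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-turkish-wiki",
                           num_train_epochs=2,
                           per_device_train_batch_size=8),
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()
```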

Inference quality is not very good at the moment.
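
A quick way to try the checkpoint, assuming it has been saved or pushed under the name `gpt2-turkish-wiki` (the prompt below is just an example):

```python
# Minimal generation sketch; "gpt2-turkish-wiki" is assumed to be the local
# directory or hub id where the fine-tuned checkpoint lives.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2-turkish-wiki")
print(generator("Türkiye'nin başkenti", max_length=50)[0]["generated_text"])
```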

| Epoch | train_loss | valid_loss | accuracy | perplexity | time (mm:ss) |
|-------|------------|------------|----------|------------|--------------|
| 0     | 4.373726   | 5.398773   | 0.264228 | 221.134857 | 02:56        |
| 1     | 4.264910   | 5.344171   | 0.267870 | 209.384140 | 02:54        |
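
The perplexity column is simply `exp(valid_loss)`:

```python
import math
print(math.exp(5.344171))  # ~209.38, the epoch-1 perplexity above
```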


TODO: Train on the full Turkish Wikipedia dump, which is currently about 3 GB unpacked.