---
language: id
widget:
- text: "Wahai rembulan yang tertutup awan hujan"
---
# Indonesian GPT-2 fine-tuned on Indonesian poems
This is the Indonesian gpt2-small model fine-tuned on Indonesian poems. The dataset can be found here. All training was done in a Google Colab Jupyter Notebook (soon).
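A minimal text-generation sketch using the `transformers` pipeline; the model id below is a placeholder, so substitute this model's actual repository name on the Hugging Face Hub:

```python
from transformers import pipeline

# "user/gpt2-small-indonesian-poem" is a hypothetical placeholder id;
# replace it with this model's actual Hub repository name.
generator = pipeline("text-generation", model="user/gpt2-small-indonesian-poem")

# Prompt taken from the widget example above.
outputs = generator(
    "Wahai rembulan yang tertutup awan hujan",
    max_length=64,
    do_sample=True,
    top_k=50,
)
print(outputs[0]["generated_text"])
```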
The sub-dataset information and the distribution of the training and validation splits are as follows (an example of producing such a split is sketched after the table):
| split      | count | percentage |
|------------|-------|------------|
| train      | 7,358 | 80%        |
| validation | 1,890 | 20%        |
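A sketch of how an 80/20 split like this can be produced with the `datasets` library; the dataset id `id_puisi` is an assumption matching the "id puisi" row below, not a name confirmed by this card:

```python
from datasets import load_dataset

# "id_puisi" is an assumed dataset id; the actual dataset is linked above.
dataset = load_dataset("id_puisi", split="train")

# 80% train / 20% validation, matching the split table above.
splits = dataset.train_test_split(test_size=0.2, seed=42)
train_ds, val_ds = splits["train"], splits["test"]
print(len(train_ds), len(val_ds))
```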
## Evaluation results
The model evaluation results after 10 epochs are as follows:
| dataset  | train loss | eval loss | eval perplexity |
|----------|------------|-----------|-----------------|
| id puisi | 3.43       | 3.54      | 34.47           |
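The eval perplexity is simply the exponential of the eval loss; a quick check of the table's numbers:

```python
import math

eval_loss = 3.54
perplexity = math.exp(eval_loss)  # e**3.54 ≈ 34.47, matching the table
print(f"{perplexity:.2f}")
```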
The logs and parameters can be found on the wandb page here.