
Indonesian GPT-2 medium fine-tuned on Indonesian poems

This is the Indonesian GPT-2 medium model fine-tuned on Indonesian poems. The dataset can be found here. All training was done in a Google Colab Jupyter notebook (to be shared soon).
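
As a minimal usage sketch, the model can be loaded with the `transformers` library like any other causal language model. The model ID and prompt below are placeholders, not the actual repository name:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Hypothetical model ID -- replace with this model's actual repository name.
model_id = "your-username/gpt2-medium-indonesian-poem"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt with an opening line and sample a poem continuation.
prompt = "Wahai rembulan yang tertutup awan hujan"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```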

The dataset is split into two subsets, detailed below (a split sketch follows the table):

| split      | count (examples) | percentage |
|------------|------------------|------------|
| train      | 7,358            | 80%        |
| validation | 1,890            | 20%        |
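
A split like this can be reproduced with the `datasets` library. The dataset ID and seed below are assumptions for illustration; the card links the actual dataset above:

```python
from datasets import load_dataset

# Hypothetical dataset ID -- use the dataset linked in this card.
dataset = load_dataset("id_puisi", split="train")

# 80/20 train/validation split, matching the proportions in the table.
splits = dataset.train_test_split(test_size=0.2, seed=42)
train_ds, valid_ds = splits["train"], splits["test"]
print(len(train_ds), len(valid_ds))
```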

Evaluation results

The model evaluation results after 10 epochs are as follows:

| dataset  | train/loss | eval/loss | eval perplexity |
|----------|------------|-----------|-----------------|
| id puisi | 3.104      | 3.384     | 29.4884         |
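
The reported perplexity is consistent with the standard definition as the exponential of the mean cross-entropy loss, as this small check shows:

```python
import math

# Perplexity = exp(eval loss) for a causal language model.
eval_loss = 3.384
perplexity = math.exp(eval_loss)
print(perplexity)  # ~29.49, matching the 29.4884 reported in the table
```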

The training logs can be found on the wandb page here.
