This model is based on the smallest GPT-2 model (137M parameters), fine-tuned on the Gutenberg Poetry Corpus (142 MB of text) for 3 epochs.

The model generates lines of English poetry, each starting with the `<start>` tag, which can be removed through post-processing.
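A minimal generation sketch with the `transformers` library is shown below. The repository ID `your-namespace/gpoet` is a placeholder, not the model's actual Hub ID, and the sampling parameters are illustrative assumptions; the post-processing step simply strips the `<start>` tag as described above.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo ID -- replace with the actual model ID on the Hub.
model_id = "your-namespace/gpoet"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt with the <start> tag that prefixes each generated line.
inputs = tokenizer("<start>", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=60,   # illustrative value
    do_sample=True,
    top_p=0.9,           # illustrative value
)
text = tokenizer.decode(outputs[0])

# Post-processing: remove the <start> tag from each line.
lines = [line.replace("<start>", "").strip() for line in text.split("\n")]
print("\n".join(line for line in lines if line))
```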

This model was trained by Teo Ferrari as part of his Bachelor's thesis at HEIG-VD, supervised by Andrei Popescu-Belis. The model is described in "GPoeT: a Language Model Trained for Rhyme Generation on Synthetic Data" and is used in the CR-PO system for interactive poem generation, along with several other models fine-tuned for specific topics or emotions.
