
Thepoet is an Arabic poem generator: a pre-trained language model based on the OpenAI GPT-2 architecture.

Special thanks to aubmindlab for their pre-trained Arabic model, AraGPT2-large (https://huggingface.co/aubmindlab/aragpt2-large).

Base model configuration (from the AraGPT2 card):

| Model | Optimizer | Context size | Embedding size | Num. heads | Num. layers | Model size / Num. parameters |
|---|---|---|---|---|---|---|
| AraGPT2-large | adafactor | 1024 | 1280 | 20 | 36 | 2.98 GB / 792M |
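As a quick-start illustration, here is a minimal generation sketch using the transformers library. The model id below is a placeholder (this card does not state the repository path), and the generation parameters are generic defaults rather than settings prescribed by this card.

```python
# Minimal generation sketch (assumptions: the model id is a placeholder for this
# repository's Hub path, and AraGPT2-large derivatives may need trust_remote_code=True
# because the large variant uses a custom GROVER-based implementation).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="<this-model-repo-id>",  # placeholder: replace with the actual Hub path
    trust_remote_code=True,
)

prompt = "قفا نبك من ذكرى حبيب ومنزل"  # a classical Arabic verse, used here only as a prompt
outputs = generator(
    prompt,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)
print(outputs[0]["generated_text"])
```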

Trained on two Arabic poetry datasets from Kaggle (a hedged preprocessing sketch follows the list):

- 512 MB Arabic Poem Comprehensive Dataset (APCD) (https://www.kaggle.com/datasets/mohamedkhaledelsafty/best-arabic-poem-comprehensive-dataset)
- 150 MB Arabic Poem Dataset (https://www.kaggle.com/datasets/ahmedabelal/arabic-poetry)
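The card does not specify how the two corpora were merged for fine-tuning; the sketch below only illustrates one plausible way to concatenate them into a single training text file. The file names and the verse column name are hypothetical and would need to be adapted to the actual Kaggle exports.

```python
# Hypothetical preprocessing sketch: merge the two Kaggle poem datasets into one
# plain-text training corpus. File names and column names are assumptions, not
# taken from this card or the datasets' documentation.
import pandas as pd

FILES = ["arabic_poem_comprehensive_dataset.csv", "arabic_poetry.csv"]  # hypothetical paths
VERSE_COLUMN = "poem_text"  # hypothetical column name; check each CSV's schema

with open("train.txt", "w", encoding="utf-8") as out:
    for path in FILES:
        df = pd.read_csv(path)
        for verse in df[VERSE_COLUMN].dropna():
            out.write(str(verse).strip() + "\n")
```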

Eval results

The final perplexity reached was 119.5661.
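For context, perplexity is typically reported as the exponential of the mean token-level cross-entropy on a held-out set; the snippet below only illustrates that relationship, with a hypothetical evaluation loss chosen to be consistent with the number above.

```python
# Sketch of the loss-to-perplexity relation (the eval_loss value is hypothetical,
# chosen so that exp(eval_loss) is close to the reported 119.5661).
import math

eval_loss = 4.784  # hypothetical mean cross-entropy (nats per token) on held-out poems
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:.2f}")  # ≈ 119.6
```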

BibTeX entry and citation info

@misc{elabaji2022thepoet,
  author = {Mohamad El Abaji},
  title  = {Thepoet: An Arabic Poem Generator},
  year   = {2022}
}