
GPT2-Small-Arabic-Poetry

Model description

A GPT-2 model fine-tuned on an Arabic poetry dataset, based on gpt2-small-arabic.

Intended uses & limitations

How to use

An example is provided in this colab notebook.
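The model can also be loaded directly with the 🤗 Transformers library. The loading lines below come from the model page's usage snippet; the generation call is a minimal sketch whose prompt and sampling parameters are illustrative, not taken from the notebook:

from transformers import AutoTokenizer, AutoModelWithLMHead, pipeline

# AutoModelWithLMHead is deprecated in newer Transformers releases;
# AutoModelForCausalLM is the modern equivalent.
tokenizer = AutoTokenizer.from_pretrained("akhooli/gpt2-small-arabic-poetry")
model = AutoModelWithLMHead.from_pretrained("akhooli/gpt2-small-arabic-poetry")

# Minimal generation sketch; prompt and sampling parameters are illustrative only.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
result = generator("يا ليل", max_length=50, do_sample=True, top_k=50)
print(result[0]["generated_text"])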

Limitations and bias

Both GPT2-small-arabic (trained on Arabic Wikipedia) and this model have several limitations in terms of coverage and training performance. Use them for demonstrations or proofs of concept, not as production code.

Training data

This model was fine-tuned on the Arabic Poetry dataset, which covers 9 different eras and contains around 40k poems in total, starting from the gpt2-small-arabic transformer model.
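Simple Transformers' language-modeling trainer (used in the next section) consumes a plain text file, so a preprocessing step along these lines is implied; the file name and column name here are hypothetical, not taken from the original Kaggle dataset:

import pandas as pd

# Hypothetical layout: one poem per row in a CSV with a "poem_text" column.
df = pd.read_csv("arabic_poetry.csv")
with open("train.txt", "w", encoding="utf-8") as f:
    for poem in df["poem_text"].dropna():
        f.write(poem.strip() + "\n\n")  # blank line between poems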

Training procedure

Training was done using the Simple Transformers library on Kaggle, on a free GPU.
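A fine-tuning run with Simple Transformers would look roughly like the sketch below; the hyperparameters are illustrative assumptions, since the actual settings of the Kaggle run are not documented here:

from simpletransformers.language_modeling import LanguageModelingModel

train_args = {
    "mlm": False,                  # GPT-2 is a causal LM, not a masked LM
    "num_train_epochs": 3,         # illustrative; actual value unknown
    "overwrite_output_dir": True,
}

# Start from the base Arabic GPT-2 and fine-tune on the poetry text file.
model = LanguageModelingModel("gpt2", "akhooli/gpt2-small-arabic", args=train_args)
model.train_model("train.txt")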

Eval results

Final perplexity reached was 76.3 (loss: 4.33).
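These two figures are consistent with each other, since perplexity for a causal language model is the exponential of the cross-entropy loss:

import math

print(math.exp(4.33))  # ≈ 75.9, in line with the reported final perplexity of 76.3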

BibTeX entry and citation info

@misc{khooli2020gpt2smallarabicpoetry,
  author = {Abed Khooli},
  title = {GPT2-Small-Arabic-Poetry},
  year = {2020}
}