GPT2-Persian

bolbolzaban/gpt2-persian is a GPT-2 language model trained with hyperparameters similar to the standard gpt2-medium, with the following differences:

  1. The context size is reduced from 1024 to 256 subwords in order to make the training affordable.
  2. Instead of BPE, Google's SentencePiece tokenizer is used for tokenization.
  3. The training dataset includes only Persian text. All non-Persian characters are replaced with special tokens (e.g. [LAT], [URL], [NUM]).

Please refer to this blog post for further details. You can also try the model here or on Bolbolzaban.com.

How to use

You can use this model directly with a pipeline for text generation:

from transformers import pipeline, AutoTokenizer, GPT2LMHeadModel

tokenizer = AutoTokenizer.from_pretrained('bolbolzaban/gpt2-persian')
model = GPT2LMHeadModel.from_pretrained('bolbolzaban/gpt2-persian')
generator = pipeline('text-generation', model=model, tokenizer=tokenizer)

# Generate up to the model's full context size of 256 tokens
sample = generator('در یک اتفاق شگفت انگیز، پژوهشگران', max_length=256)

If you are using TensorFlow, import TFGPT2LMHeadModel instead of GPT2LMHeadModel.

Fine-tuning

A basic fine-tuning example is available on this GitHub repo.

Special Tokens

gpt2-persian was trained for the purpose of research on Persian poetry. For that reason, all English words and numbers are replaced with special tokens, and only the standard Persian alphabet is used in the input text. Here is an example:

Original text: اگر آیفون یا آیپد شما دارای سیستم عامل iOS 14.3 یا iPadOS 14.3 یا نسخه‌های جدیدتر باشد

Text used in training: اگر آیفون یا آیپد شما دارای سیستم عامل [LAT] [NUM] یا [LAT] [NUM] یا نسخه‌های جدیدتر باشد
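The replacement above can be sketched with a couple of regular expressions. Note that `mask_non_persian` is a hypothetical helper written for illustration; the exact rules used to build the training corpus (e.g. how [URL] tokens are assigned) may differ.

```python
import re

def mask_non_persian(text):
    # Hypothetical sketch of the corpus preprocessing described above;
    # the actual rules used for gpt2-persian may differ.
    text = re.sub(r'[A-Za-z]+', '[LAT]', text)      # Latin words -> [LAT]
    text = re.sub(r'\d+(?:\.\d+)*', '[NUM]', text)  # numbers -> [NUM]
    return text

print(mask_non_persian('سیستم عامل iOS 14.3'))
# -> سیستم عامل [LAT] [NUM]
```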

Please consider normalizing your input text with Hazm or a similar library, and ensure that only Persian characters are provided as input.

If you want to use classical Persian poetry as input, use [BOM] (beginning of mesra) at the beginning of each verse (مصرع), followed by [EOS] (end of statement) at the end of each couplet (بیت).

See the following examples:

[BOM] توانا بود

[BOM] توانا بود هر که دانا بود [BOM]

[BOM] توانا بود هر که دانا بود [BOM] ز دانش دل پیر

[BOM] توانا بود هر که دانا بود [BOM] ز دانش دل پیر برنا بود [EOS]
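A prompt in this format can be assembled with a small helper like the following (`poetry_prompt` is a hypothetical function written for illustration, not part of the model's API):

```python
def poetry_prompt(mesras):
    # Prefix each mesra (half-verse) with [BOM] and close every
    # couplet (pair of mesras) with [EOS].
    parts = []
    for i, mesra in enumerate(mesras):
        parts.append('[BOM] ' + mesra)
        if i % 2 == 1:  # the second mesra completes the couplet
            parts.append('[EOS]')
    return ' '.join(parts)

prompt = poetry_prompt(['توانا بود هر که دانا بود', 'ز دانش دل پیر برنا بود'])
# -> '[BOM] توانا بود هر که دانا بود [BOM] ز دانش دل پیر برنا بود [EOS]'
```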

If you would like to learn about the structure of classical Persian poetry, refer to these blog posts.

Acknowledgment

This project is supported by Cloud TPUs from Google’s TensorFlow Research Cloud (TFRC).

Citation and Reference

Please reference the "bolbolzaban.com" website if you use gpt2-persian in your research or commercial application.

Contacts

Please reach out on LinkedIn or Telegram if you have any questions or need help using the model.

Follow Bolbolzaban on Twitter, Telegram, or Instagram.
