
🇷🇴 Romanian mGPT 1.3B

Language model for Romanian. The model has 1.3B parameters, as its name suggests.
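
The checkpoint can be loaded like any causal LM on the Hugging Face Hub. A minimal usage sketch, assuming the standard transformers API (the prompt and generation parameters are illustrative, not from the original card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai-forever/mGPT-1.3B-romanian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Limba română este"  # "The Romanian language is"
inputs = tokenizer(prompt, return_tensors="pt")
# Sample a short continuation; tune max_new_tokens / top_p to taste.
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```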

Romanian belongs to the Indo-European language family. It is a very lyrical language with approximately 24 million speakers. Here are some facts about it:

  1. It is a Romance language, closely related to Italian, French, Spanish, Portuguese, and Catalan.
  2. It retains several Latin characteristics, making it unique among the Romance languages.
  3. While primarily spoken in Romania and Moldova, there are also Romanian speakers in neighboring countries and diaspora communities worldwide.

Technical details

It is one of the models derived from the base mGPT-XL (1.3B) model (see the list below), which was originally trained on 61 languages from 25 language families using the Wikipedia and C4 corpora.

We found additional data for 23 languages, most of which are considered minor, and decided to further tune the base model. Romanian mGPT 1.3B was trained for another 5,000 steps with batch_size=4 and a context window of 2,048 tokens on one A100 GPU.
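
Those hyperparameters pin down roughly how many tokens the fine-tuning pass saw. A quick back-of-the-envelope check, assuming no gradient accumulation (the card does not state any):

```python
steps = 5000       # additional fine-tuning steps
batch_size = 4     # sequences per step
context = 2048     # tokens per sequence

tokens_seen = steps * batch_size * context
print(f"{tokens_seen:,} tokens")  # 40,960,000 tokens, i.e. roughly 41M
```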

The final validation perplexity of this model is 3.44.
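
Perplexity is the exponential of the average per-token cross-entropy loss, so the reported value corresponds to a validation loss of about 1.24 nats per token:

```python
import math

perplexity = 3.44
loss = math.log(perplexity)  # average cross-entropy in nats per token
print(round(loss, 2))        # 1.24

# Round-trip: exponentiating the loss recovers the perplexity.
assert abs(math.exp(loss) - perplexity) < 1e-9
```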

Chart of the training loss and perplexity:

Other mGPT-1.3B models

Feedback

If you find a bug or have additional data to train the model on your language, please give us feedback.

The model will be improved over time. Stay tuned!
