mitra-mir committed f07ba74 (parent f476959): update readme

# A Transformer-based Persian Language Model Further Pretrained on Persian Poetry

ALBERT-Persian was first introduced by Hooshvare, with a 30,000-token vocabulary, as a lite BERT for self-supervised learning of language representations for the Persian language. Here we build on its capabilities by further pretraining it on a large corpus of Persian poetry. This model has been post-trained on 80 percent of the verses in the Ganjoor Persian poetry dataset and evaluated on the remaining 20 percent.
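As a rough sketch of the evaluation setup described above, the 80/20 verse split could be produced as follows. The verse list here is placeholder data standing in for the Ganjoor dataset, and the split procedure is an assumption; the original training scripts are not part of this README.

```python
import random

# Placeholder verses standing in for the Ganjoor poetry dataset
# (the actual dataset and loading code are not shown in this README).
verses = [f"verse {i}" for i in range(100)]

# Shuffle deterministically, then hold out 20% of verses for evaluation,
# mirroring the 80/20 post-training split described above.
random.seed(0)
random.shuffle(verses)
split = int(0.8 * len(verses))
train_verses = verses[:split]
eval_verses = verses[split:]

print(len(train_verses), len(eval_verses))  # 80 20
```

A fixed seed keeps the split reproducible, so the same held-out 20 percent can be reused across evaluation runs.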
