---
language: fa
license: apache-2.0
---

# ALBERT Persian

A Lite BERT for Self-supervised Learning of Language Representations for the Persian Language

> You can call it little BERT.

[ALBERT-Persian](https://github.com/m3hrdadfi/albert-persian) is the first attempt at training ALBERT for the Persian language. The model was trained based on Google's ALBERT BASE Version 2.0 over a corpus of various writing styles from numerous subjects (e.g., scientific texts, novels, news) comprising more than 3.9M documents, 73M sentences, and 1.3B words, following the same approach used for ParsBERT.

Please follow the [ALBERT-Persian](https://github.com/m3hrdadfi/albert-persian) repo for the latest information about previous and current models.
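
For quick experimentation, the pre-trained model can be loaded with the Hugging Face `transformers` library. The sketch below assumes the checkpoint is published on the Hub as `m3hrdadfi/albert-fa-base-v2`; check the repo above for the exact identifiers of the released models.

```python
# Minimal sketch: load ALBERT-Persian and run masked-language-model inference.
# The Hub identifier below is an assumption; see the ALBERT-Persian repo for
# the actual names of the published checkpoints.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_name = "m3hrdadfi/albert-fa-base-v2"  # assumed model identifier

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Persian example sentence: "The weather is [MASK] today."
text = f"هوا امروز {tokenizer.mask_token} است."
for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 4))
```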

## Persian Sentiment [Digikala, SnappFood, DeepSentiPers]

This task aims to classify text, such as comments, according to its emotional bias. We tested three well-known datasets for this task: `Digikala` user comments, `SnappFood` user comments, and `DeepSentiPers` in both binary and multi-class forms.
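
As a rough illustration, a fine-tuned sentiment checkpoint could be used through the `text-classification` pipeline. The checkpoint name below is an assumption for illustration only; refer to the ALBERT-Persian repo for the sentiment models actually released for each dataset.

```python
# Hedged sketch: sentiment classification with a fine-tuned ALBERT-Persian
# checkpoint. The model identifier is hypothetical; replace it with one of the
# sentiment checkpoints listed in the ALBERT-Persian repo.
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="m3hrdadfi/albert-fa-base-v2-sentiment-digikala",  # assumed identifier
)

# Persian example comment: "The product quality was great, I recommend buying it."
result = sentiment("کیفیت محصول عالی بود، خرید آن را پیشنهاد می‌کنم.")
print(result)  # e.g. [{"label": "recommended", "score": 0.98}] with labels from the dataset
```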

### Digikala

Digikala user comments are provided by the [Open Data Mining Program (ODMP)](https://www.digikala.com/opendata/). This dataset contains 62,321 user comments with three labels:

| Label           | Count  |
|:---------------:|:------:|
| no_idea         | 10,394 |
| not_recommended | 15,885 |
| recommended     | 36,042 |

**Download**
You can download the dataset from [here](https://www.digikala.com/opendata/).
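
Once downloaded, the label distribution above can be checked with a few lines of pandas. This is only a sketch: the file name and column name (`digikala_comments.csv`, `label`) are placeholders and must be adapted to the actual export format.

```python
# Hedged sketch: inspect the Digikala comments after downloading the ODMP export.
# The file name and the "label" column are assumptions about the export format.
import pandas as pd

df = pd.read_csv("digikala_comments.csv")  # placeholder path

print(len(df))                     # expected: 62,321 comments
print(df["label"].value_counts())  # expected: recommended, not_recommended, no_idea
```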

## Results

The following table summarizes the F1 scores obtained in comparison with other models and architectures.

| Dataset                  | ALBERT-fa-base-v2 | ParsBERT-v1 | mBERT | DeepSentiPers |
|:------------------------:|:-----------------:|:-----------:|:-----:|:-------------:|
| Digikala User Comments   | 81.12             | 81.74       | 80.74 | -             |
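
For reference, scores of this kind are typically computed with `scikit-learn`. The snippet below is only a sketch; the card does not state which averaging method was used, so `weighted` is an assumption.

```python
# Hedged sketch: compute an F1 score for the three-label Digikala task.
# The "weighted" average is an assumption; the card does not specify it.
from sklearn.metrics import f1_score

y_true = ["recommended", "not_recommended", "no_idea", "recommended"]
y_pred = ["recommended", "no_idea", "no_idea", "recommended"]

print(round(f1_score(y_true, y_pred, average="weighted") * 100, 2))  # reported as a percentage
```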

## BibTeX entry and citation info

Please cite the following in publications:

```bibtex
@misc{ALBERTPersian,
  author = {Mehrdad Farahani},
  title = {ALBERT-Persian: A Lite BERT for Self-supervised Learning of Language Representations for the Persian Language},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/m3hrdadfi/albert-persian}},
}

@article{ParsBERT,
  title = {ParsBERT: Transformer-based Model for Persian Language Understanding},
  author = {Mehrdad Farahani and Mohammad Gharachorloo and Marzieh Farahani and Mohammad Manthouri},
  journal = {ArXiv},
  year = {2020},
  volume = {abs/2005.12515}
}
```

## Questions?
Post a GitHub issue on the [ALBERT-Persian](https://github.com/m3hrdadfi/albert-persian) repo.