---
language: fa
license: apache-2.0
---

# ALBERT Persian

A Lite BERT for Self-supervised Learning of Language Representations for the Persian Language

> You can call it Little BERT.

[ALBERT-Persian](https://github.com/m3hrdadfi/albert-persian) is the first attempt to train ALBERT for the Persian language. The model was trained on Google's ALBERT BASE Version 2.0 over a corpus covering a wide range of writing styles and subjects (e.g., scientific articles, novels, news), comprising more than 3.9M documents, 73M sentences, and 1.3B words, following the same procedure used for ParsBERT.

Please follow the [ALBERT-Persian](https://github.com/m3hrdadfi/albert-persian) repo for the latest information about previous and current models.
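The checkpoints can be loaded with the Hugging Face `transformers` library. A minimal sketch, assuming a Hub id such as `m3hrdadfi/albert-fa-base-v2-clf-digimag` (a hypothetical id for the DigiMag classifier; check the ALBERT-Persian repo for the exact published ids):

```python
# Minimal usage sketch for a fine-tuned text classifier.
# NOTE: MODEL_ID is an assumed Hub id; see the ALBERT-Persian repo for the real one.
MODEL_ID = "m3hrdadfi/albert-fa-base-v2-clf-digimag"


def classify(texts):
    """Label a batch of Persian texts with the fine-tuned classifier.

    Requires: pip install transformers sentencepiece
    """
    from transformers import pipeline  # imported lazily so the sketch stays light
    clf = pipeline("text-classification", model=MODEL_ID)
    return clf(texts)


# Usage (downloads the model on first call):
# classify(["این بازی ویدیویی فوق‌العاده بود"])
```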

## Persian Text Classification [DigiMag, Persian News]

The goal of this task is to label texts in a supervised manner, using two existing datasets: `DigiMag` and `Persian News`.


### DigiMag

The DigiMag dataset consists of 8,515 articles scraped from [Digikala Online Magazine](https://www.digikala.com/mag/), covering seven classes:

1. Video Games
2. Shopping Guide
3. Health Beauty
4. Science Technology
5. General
6. Art Cinema
7. Books Literature


| Label              | # of samples |
|:------------------:|:------------:|
| Video Games        | 1967 |
| Shopping Guide     | 125  |
| Health Beauty      | 1610 |
| Science Technology | 2772 |
| General            | 120  |
| Art Cinema         | 1667 |
| Books Literature   | 254  |
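The class distribution above is heavily imbalanced (120 samples for General vs. 2,772 for Science Technology), so training a classifier on it often benefits from class weighting. A minimal plain-Python sketch (not part of the original training recipe) using the counts from the table:

```python
# Per-class sample counts taken from the DigiMag table above.
counts = {
    "Video Games": 1967,
    "Shopping Guide": 125,
    "Health Beauty": 1610,
    "Science Technology": 2772,
    "General": 120,
    "Art Cinema": 1667,
    "Books Literature": 254,
}

total = sum(counts.values())  # 8515, matching the stated dataset size

# Inverse-frequency weights, normalized so a perfectly balanced dataset
# would assign every class a weight of 1.0. Rare classes get large weights.
n_classes = len(counts)
class_weights = {label: total / (n_classes * n) for label, n in counts.items()}
```

Such weights can be passed, for example, to a weighted cross-entropy loss so that errors on rare classes like Shopping Guide or General are penalized more strongly.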


**Download**

You can download the dataset from [here](https://drive.google.com/uc?id=1YgrCYY-Z0h2z0-PfWVfOGt1Tv0JDI-qz).


## Results

The following table summarizes the F1 score obtained by ALBERT-fa-base-v2 compared to other models and architectures.

| Dataset           | ALBERT-fa-base-v2 | ParsBERT-v1 | mBERT |
|:-----------------:|:-----------------:|:-----------:|:-----:|
| Digikala Magazine | 92.33             | 93.59       | 90.72 |
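For multi-class tasks like DigiMag, F1 scores such as these are commonly macro-averaged over the per-class F1 values (the exact averaging used for the numbers above is not stated in this card). A plain-Python sketch of macro F1 on toy labels, purely for illustration:

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute F1 per class, then take the unweighted mean."""
    classes = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        f1s.append(f1)
    return sum(f1s) / len(f1s)


# Toy example using three DigiMag class names (illustrative, not real predictions).
y_true = ["Video Games", "Video Games", "General", "General", "Health Beauty"]
y_pred = ["Video Games", "General", "General", "General", "Health Beauty"]
score = macro_f1(y_true, y_pred)
```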


### BibTeX entry and citation info

Please cite this work in publications as follows:

```bibtex
@misc{ALBERTPersian,
  author = {Mehrdad Farahani},
  title = {ALBERT-Persian: A Lite BERT for Self-supervised Learning of Language Representations for the Persian Language},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/m3hrdadfi/albert-persian}},
}

@article{ParsBERT,
  title={ParsBERT: Transformer-based Model for Persian Language Understanding},
  author={Mehrdad Farahani and Mohammad Gharachorloo and Marzieh Farahani and Mohammad Manthouri},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.12515}
}
```

## Questions?

Post a GitHub issue on the [ALBERT-Persian](https://github.com/m3hrdadfi/albert-persian) repo.