moussaKam committed
Commit
cb67d61
1 Parent(s): dccfab0

Update README.md

Files changed (1)
  1. README.md +21 -3
README.md CHANGED
@@ -1,3 +1,21 @@
- ---
- license: apache-2.0
- ---
+ ---
+ tags:
+ - summarization
+ - bart
+
+ language:
+ - ar
+ widget:
+ - text: بيروت هي عاصمة <mask>.
+
+ license: apache-2.0
+
+ pipeline_tag: "fill-mask"
+ ---
+
+ AraBART is the first Arabic model in which the encoder and the decoder are pretrained end-to-end, based on BART. AraBART follows the architecture of BART-Base,
+ which has 6 encoder and 6 decoder layers and 768 hidden dimensions. In total, AraBART has 139M parameters.
+
+ AraBART achieves the best performance on multiple abstractive summarization datasets, outperforming strong baselines including pretrained Arabic BERT-based models and the multilingual mBART and mT5 models.
+
+
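Since the updated card sets `pipeline_tag: "fill-mask"` and a masked-sentence widget, a minimal usage sketch with the `transformers` fill-mask pipeline is shown below. The repository ID `moussaKam/AraBART` is an assumption inferred from the committing account and model name; it is not stated in the diff.

```python
from transformers import pipeline

# Assumed repository ID (not stated in the diff): committing account + model name.
MODEL_ID = "moussaKam/AraBART"

# Fill-mask pipeline, matching the card's pipeline_tag and widget configuration.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# Widget sentence from the card: "Beirut is the capital of <mask>."
for prediction in fill_mask("بيروت هي عاصمة <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```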
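Likewise, the BART-Base dimensions and parameter count mentioned in the description can be checked against the loaded checkpoint. This is a sketch under the same assumed repository ID.

```python
from transformers import AutoConfig, AutoModel

MODEL_ID = "moussaKam/AraBART"  # assumed repository ID, as above

# BART-Base style configuration: 6 encoder layers, 6 decoder layers, 768 hidden size.
config = AutoConfig.from_pretrained(MODEL_ID)
print(config.encoder_layers, config.decoder_layers, config.d_model)

# Total parameter count should come out near the 139M stated on the card.
model = AutoModel.from_pretrained(MODEL_ID)
print(sum(p.numel() for p in model.parameters()) / 1e6, "M parameters")
```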