dqnguyen committed
Commit
748f5b5
1 Parent(s): d2c8374

Create README.md

Files changed (1)
  1. README.md +17 -0
README.md ADDED
@@ -0,0 +1,17 @@
# <a name="introduction"></a> BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese

Two BARTpho versions, `BARTpho-syllable` and `BARTpho-word`, are the first public large-scale monolingual sequence-to-sequence models pre-trained for Vietnamese. BARTpho uses the "large" architecture and pre-training scheme of the sequence-to-sequence denoising model [BART](https://github.com/pytorch/fairseq/tree/main/examples/bart), and is thus especially suitable for generative NLP tasks. Experiments on the downstream task of Vietnamese text summarization show that, in both automatic and human evaluations, BARTpho outperforms the strong baseline [mBART](https://github.com/pytorch/fairseq/tree/main/examples/mbart) and improves the state of the art.
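
As an illustration, the sketch below shows one way to load a BARTpho checkpoint with the Hugging Face `transformers` library and extract features from a Vietnamese sentence. The checkpoint identifier `vinai/bartpho-syllable` and the example sentence are assumptions made for this sketch; please refer to [BARTpho's homepage](https://github.com/VinAIResearch/BARTpho) for the released model names and usage instructions.

```python
# Minimal sketch: feature extraction with BARTpho via Hugging Face transformers.
# Assumption: the checkpoint name "vinai/bartpho-syllable" is illustrative;
# see BARTpho's homepage for the officially released identifiers.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-syllable")
model = AutoModel.from_pretrained("vinai/bartpho-syllable")

sentence = "Chúng tôi là những nghiên cứu viên."  # "We are researchers."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
    features = outputs.last_hidden_state  # contextual representations from the seq2seq model
```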

The general architecture and experimental results of BARTpho can be found in our [paper](https://arxiv.org/abs/2109.09701):

    @article{bartpho,
        title   = {{BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese}},
        author  = {Nguyen Luong Tran and Duong Minh Le and Dat Quoc Nguyen},
        journal = {arXiv preprint},
        volume  = {arXiv:2109.09701},
        year    = {2021}
    }

**Please CITE** our paper when BARTpho is used to help produce published results or is incorporated into other software.

For further information or requests, please go to [BARTpho's homepage](https://github.com/VinAIResearch/BARTpho)!