
BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese

The pre-trained model vinai/bartpho-syllable-base is the "base" variant of BARTpho-syllable, which uses the "base" architecture and pre-training scheme of the sequence-to-sequence denoising model BART. The general architecture and experimental results of BARTpho can be found in our paper:

@article{bartpho,
title     = {{BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese}},
author    = {Nguyen Luong Tran and Duong Minh Le and Dat Quoc Nguyen},
journal   = {arXiv preprint},
volume    = {arXiv:2109.09701},
year      = {2021}
}

Please cite our paper whenever BARTpho is used to help produce published results or is incorporated into other software.

For further information or requests, please visit BARTpho's homepage: https://github.com/VinAIResearch/BARTpho
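The sketch below illustrates one way to load the pre-trained vinai/bartpho-syllable-base checkpoint with the Hugging Face transformers library and extract contextual features from a Vietnamese sentence. The example sentence is only illustrative; since this is the syllable-level variant, the input is plain text without Vietnamese word segmentation.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the syllable-level "base" BARTpho model and its tokenizer from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-syllable-base")
model = AutoModel.from_pretrained("vinai/bartpho-syllable-base")

# Illustrative Vietnamese input; syllable-level BARTpho takes raw text (no word segmentation needed).
sentence = "Chúng tôi là những nghiên cứu viên."

inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    # Returns a Seq2SeqModelOutput with decoder last_hidden_state and encoder hidden states.
    features = model(**inputs)
```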
