typo: encoder-encoder -> encoder-decoder


BART is an encoder-decoder, not an encoder-encoder.

It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and many other more recent pretraining schemes.
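The encoder-decoder structure described above can be seen directly in the Hugging Face `transformers` implementation of BART. The sketch below is illustrative only: it builds a tiny, randomly initialized config (the hyperparameter values are arbitrary assumptions chosen to keep it small) so no pretrained weights are downloaded.

```python
import torch
from transformers import BartConfig, BartModel

# Tiny, randomly initialized config -- illustrative values only,
# not the real facebook/bart-large hyperparameters.
config = BartConfig(
    vocab_size=100,
    d_model=32,
    encoder_layers=1,
    decoder_layers=1,
    encoder_attention_heads=2,
    decoder_attention_heads=2,
    encoder_ffn_dim=64,
    decoder_ffn_dim=64,
    max_position_embeddings=32,
)
model = BartModel(config)

# BART pairs a bidirectional (BERT-like) encoder with an
# autoregressive, left-to-right (GPT-like) decoder.
input_ids = torch.tensor([[0, 5, 6, 2]])          # source tokens
decoder_input_ids = torch.tensor([[2, 0, 5, 6]])  # shifted target tokens
out = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)

print(type(model.get_encoder()).__name__)  # BartEncoder
print(type(model.get_decoder()).__name__)  # BartDecoder
print(out.last_hidden_state.shape)         # decoder hidden states: (1, 4, 32)
```

The model exposes the two halves separately (`get_encoder()` / `get_decoder()`), which is exactly the encoder-decoder pairing the corrected model card describes.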


AI at Meta org member changed pull request status to merged
