---
license: mit
widget:
  - text: I like you. </s></s> I love you.
---

# roberta-large-mnli

Trained by Facebook (original source: [fairseq](https://github.com/pytorch/fairseq)).
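A minimal usage sketch (not part of the original card): roberta-large-mnli scores a premise/hypothesis pair as contradiction, neutral, or entailment. RoBERTa joins sequence pairs with two `</s>` separator tokens, which is why the widget example above reads `I like you. </s></s> I love you.` The `nli_input` helper below is hypothetical, added only to illustrate the pair format.

```python
def nli_input(premise: str, hypothesis: str) -> str:
    """Format a premise/hypothesis pair the way the widget example does:
    the two sequences are joined with RoBERTa's </s></s> separator."""
    return f"{premise} </s></s> {hypothesis}"

text = nli_input("I like you.", "I love you.")
print(text)  # I like you. </s></s> I love you.

# With the transformers library installed, the pair can then be scored
# (commented out here to avoid downloading the large checkpoint):
# from transformers import pipeline
# classifier = pipeline("text-classification", model="roberta-large-mnli")
# print(classifier(text))  # label is one of CONTRADICTION / NEUTRAL / ENTAILMENT
```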

```bibtex
@article{liu2019roberta,
    title   = {RoBERTa: A Robustly Optimized BERT Pretraining Approach},
    author  = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and
               Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and
               Luke Zettlemoyer and Veselin Stoyanov},
    journal = {arXiv preprint arXiv:1907.11692},
    year    = {2019},
}
```