BERT-large-cased, pre-trained using RoBERTa's corpora (Wikipedia + Books + Stories + Newsroom + OpenWebText).
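
If the checkpoint is distributed in Hugging Face `transformers` format, a minimal loading sketch could look like the following. The path `./bert-large-cased-roberta-corpora` is a placeholder for illustration only, not the actual release name of this checkpoint.

```python
# Minimal sketch: loading a BERT-large-cased checkpoint with Hugging Face transformers.
# The checkpoint path below is hypothetical; substitute the released directory or hub id.
from transformers import BertModel, BertTokenizer

checkpoint = "./bert-large-cased-roberta-corpora"  # placeholder path (assumption)

tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertModel.from_pretrained(checkpoint)

# Encode a sample sentence and run a forward pass.
inputs = tokenizer("BERT-large-cased pre-trained on RoBERTa's corpora.", return_tensors="pt")
outputs = model(**inputs)

# BERT-large has a hidden size of 1024, so this prints (1, seq_len, 1024).
print(outputs.last_hidden_state.shape)
```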