How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, TFAutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("jplu/tf-xlm-roberta-base")
model = TFAutoModelWithLMHead.from_pretrained("jplu/tf-xlm-roberta-base")
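
As a minimal sketch of what these two objects do together, the snippet below fills in a masked token. The example sentence and the printed completion are illustrative, not part of the model card:

import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("jplu/tf-xlm-roberta-base")
model = TFAutoModelWithLMHead.from_pretrained("jplu/tf-xlm-roberta-base")

# XLM-RoBERTa marks the position to predict with its <mask> token.
inputs = tokenizer.encode("Paris is the <mask> of France.", return_tensors="tf")

# The language-modeling head returns vocabulary logits for every position
# as the first element of the output tuple.
logits = model(inputs)[0]

# Locate the mask position and take the highest-scoring token there.
mask_index = int(tf.where(inputs[0] == tokenizer.mask_token_id)[0][0])
predicted_id = int(tf.argmax(logits[0, mask_index]))
print(tokenizer.decode([predicted_id]))  # e.g. "capital"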

TensorFlow XLM-RoBERTa

In this repository you will find TensorFlow versions of the XLM-RoBERTa model, in base and large sizes.

XLM-RoBERTa

XLM-RoBERTa is a scaled cross-lingual sentence encoder. It is trained on 2.5 TB of Common Crawl data covering 100 languages. XLM-R achieves state-of-the-art results on multiple cross-lingual benchmarks.

Model Weights

Model                     | Downloads
------------------------- | -------------------------
jplu/tf-xlm-roberta-base  | config.json • tf_model.h5
jplu/tf-xlm-roberta-large | config.json • tf_model.h5
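
If you download config.json and tf_model.h5 by hand, from_pretrained can also load them from a local directory; the directory name below is an assumption, not part of the model card:

from transformers import TFXLMRobertaModel

# Assumes config.json and tf_model.h5 were saved into ./tf-xlm-roberta-base
# (any local directory containing both files works the same way).
model = TFXLMRobertaModel.from_pretrained("./tf-xlm-roberta-base")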

Usage

With Transformers >= 2.4, the TensorFlow models of XLM-RoBERTa can be loaded like this:

from transformers import TFXLMRobertaModel

model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")

Or

model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-large")
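
To illustrate the cross-lingual encoder in action, here is a small sketch that mean-pools the last hidden states into one vector per sentence and compares an English and a French sentence. The pooling strategy and the example sentences are illustrative choices, not part of the model card:

import tensorflow as tf
from transformers import AutoTokenizer, TFXLMRobertaModel

tokenizer = AutoTokenizer.from_pretrained("jplu/tf-xlm-roberta-base")
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")

# One shared vocabulary and one set of weights cover all 100 languages.
english = tokenizer.encode("The weather is nice today.", return_tensors="tf")
french = tokenizer.encode("Il fait beau aujourd'hui.", return_tensors="tf")

# The base model returns the last hidden states as the first tuple element,
# with shape (batch, sequence_length, hidden_size).
english_states = model(english)[0]
french_states = model(french)[0]

# Mean-pool over tokens to get one fixed-size vector per sentence.
english_vec = tf.reduce_mean(english_states, axis=1)
french_vec = tf.reduce_mean(french_states, axis=1)

# tf.keras.losses.cosine_similarity returns the *negative* cosine similarity.
similarity = -tf.keras.losses.cosine_similarity(english_vec, french_vec)
print(float(similarity[0]))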

Hugging Face model hub

All models are available on the Hugging Face model hub.

Acknowledgments

Thanks to the whole Hugging Face team for their support and their amazing library!