
# Tensorflow XLM-RoBERTa

In this repository you will find different versions of the XLM-RoBERTa model for TensorFlow.

## XLM-RoBERTa

XLM-RoBERTa is a scaled cross-lingual sentence encoder. It is trained on 2.5 TB of data across 100 languages, filtered from Common Crawl. XLM-R achieves state-of-the-art results on multiple cross-lingual benchmarks.

## Model Weights

| Model | Downloads |
| ----- | --------- |
| `jplu/tf-xlm-roberta-base` | config.json • tf_model.h5 |
| `jplu/tf-xlm-roberta-large` | config.json • tf_model.h5 |

## Usage

With Transformers >= 2.4, the TensorFlow models of XLM-RoBERTa can be loaded like this:

```python
from transformers import TFXLMRobertaModel

model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")
```

Or:

```python
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-large")
```
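Once loaded, the model can be used as a sentence encoder. A minimal sketch of a full encode-and-embed pass, assuming a recent Transformers release and TensorFlow are installed (the tokenizer call with `return_tensors="tf"` uses the current Transformers API, not the 2.4-era one):

```python
from transformers import AutoTokenizer, TFXLMRobertaModel

# Load the SentencePiece tokenizer and the TensorFlow model weights.
tokenizer = AutoTokenizer.from_pretrained("jplu/tf-xlm-roberta-base")
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")

# Tokenize a sentence and run it through the encoder.
inputs = tokenizer("Hello, world!", return_tensors="tf")
outputs = model(inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size);
# hidden_size is 768 for the base model and 1024 for the large one.
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

Because the tokenizer is multilingual, the same code works unchanged for input text in any of the 100 training languages.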

## Hugging Face model hub

All models are available on the Hugging Face model hub.

## Acknowledgments

Thanks to the Hugging Face team for their support and their amazing library!
