---
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
---

# TensorFlow XLM-RoBERTa

In this repository you will find different versions of the XLM-RoBERTa model for TensorFlow.

## XLM-RoBERTa

[XLM-RoBERTa](https://ai.facebook.com/blog/-xlm-r-state-of-the-art-cross-lingual-understanding-through-self-supervision/) is a scaled cross-lingual sentence encoder. It is trained on 2.5 TB of data filtered from Common Crawl, covering 100 languages. XLM-R achieves state-of-the-art results on multiple cross-lingual benchmarks.

## Model Weights

| Model | Downloads |
| --- | --- |
| `jplu/tf-xlm-roberta-base` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-base/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-base/tf_model.h5) |
| `jplu/tf-xlm-roberta-large` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-large/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-large/tf_model.h5) |

## Usage

With Transformers >= 2.4, the TensorFlow XLM-RoBERTa models can be loaded like this:

```python
from transformers import TFXLMRobertaModel

model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")
```

Or:

```python
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-large")
```

A complete end-to-end encoding sketch is given in the example section at the end of this README.

## Hugging Face model hub

All models are available on the [Hugging Face model hub](https://huggingface.co/jplu).

## Acknowledgments

Thanks to the whole Hugging Face team for their support and their amazing library!
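
## Example: encoding a sentence

As a quick sanity check, here is a minimal sketch of a full forward pass with the base model. It assumes these converted weights are compatible with the SentencePiece tokenizer of the original `xlm-roberta-base` checkpoint; the example sentence and the printed shape are illustrative only.

```python
from transformers import TFXLMRobertaModel, XLMRobertaTokenizer

# Assumption: the converted TF weights use the same vocabulary as the
# original xlm-roberta-base checkpoint, so its tokenizer is loaded here.
tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")

# Encode an example sentence (any of the 100 supported languages works).
input_ids = tokenizer.encode("Hello, world!", return_tensors="tf")

# Run a forward pass; the first output holds the per-token hidden states.
outputs = model(input_ids)
last_hidden_states = outputs[0]

# (batch_size, sequence_length, hidden_size) — hidden_size is 768 for base.
print(last_hidden_states.shape)
```

The same pattern applies to `jplu/tf-xlm-roberta-large`, whose hidden size is 1024.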