Migrate model card from transformers-repo
Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/jplu/tf-xlm-roberta-large/README.md
# Tensorflow XLM-RoBERTa

In this repository you will find different versions of the XLM-RoBERTa model for Tensorflow.

## XLM-RoBERTa

[XLM-RoBERTa](https://ai.facebook.com/blog/-xlm-r-state-of-the-art-cross-lingual-understanding-through-self-supervision/) is a scaled cross-lingual sentence encoder. It is trained on 2.5TB of data across 100 languages, filtered from Common Crawl. XLM-R achieves state-of-the-art results on multiple cross-lingual benchmarks.

## Model Weights

| Model | Downloads |
| -------------------------------- | --------- |
| `jplu/tf-xlm-roberta-base` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-base/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-base/tf_model.h5) |
| `jplu/tf-xlm-roberta-large` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-large/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-xlm-roberta-large/tf_model.h5) |
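
The files above can also be fetched programmatically. A minimal sketch, assuming the same files are mirrored on the model hub under these repo ids, using the `huggingface_hub` client instead of the raw S3 links:

```python
# Download a single file from a model repo on the Hugging Face hub.
# Assumption (not stated in this card): the repo ids above also host
# their files on hf.co, which is where hf_hub_download resolves them.
from huggingface_hub import hf_hub_download

config_path = hf_hub_download(
    repo_id="jplu/tf-xlm-roberta-base",
    filename="config.json",
)
print(config_path)  # local cache path of the downloaded file
```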

## Usage

With Transformers >= 2.4, the Tensorflow models of XLM-RoBERTa can be loaded like this:

```python
from transformers import TFXLMRobertaModel

model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-base")
```

Or:

```python
model = TFXLMRobertaModel.from_pretrained("jplu/tf-xlm-roberta-large")
```

## Huggingface model hub

All models are available on the [Huggingface model hub](https://huggingface.co/jplu).

## Acknowledgments

Thanks to the whole Huggingface team for the support and their amazing library!