Marissa committed on
Commit ce151e2
1 Parent(s): d5f927b

Add model card

@Meg
@Ezi

Files changed (1)
  1. README.md +208 -0
README.md ADDED
@@ -0,0 +1,208 @@
---
language:
- multilingual
- en
- es
- fr
- de
- zh
- ru
- pt
- it
- ar
- ja
- id
- tr
- nl
- pl
- fa
- vi
- sv
- ko
- he
- ro
- no
- hi
- uk
- cs
- fi
- hu
- th
- da
- ca
- el
- bg
- sr
- ms
- bn
- hr
- sl
- az
- sk
- eo
- ta
- sh
- lt
- et
- ml
- la
- bs
- sq
- arz
- af
- ka
- mr
- eu
- tl
- ang
- gl
- nn
- ur
- kk
- be
- hy
- te
- lv
- mk
- als
- is
- wuu
- my
- sco
- mn
- ceb
- ast
- cy
- kn
- br
- an
- gu
- bar
- uz
- lb
- ne
- si
- war
- jv
- ga
- oc
- ku
- sw
- nds
- ckb
- ia
- yi
- fy
- scn
- gan
- tt
- am
license: cc-by-nc-4.0
---

# xlm-mlm-100-1280

# Table of Contents

1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training](#training)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Citation](#citation)
8. [Model Card Authors](#model-card-authors)
9. [How to Get Started with the Model](#how-to-get-started-with-the-model)

# Model Details

xlm-mlm-100-1280 is the XLM model proposed in [Cross-lingual Language Model Pretraining](https://arxiv.org/abs/1901.07291) by Guillaume Lample and Alexis Conneau, trained on Wikipedia text in 100 languages. The model is a transformer pretrained using a masked language modeling (MLM) objective.

## Model Description

- **Developed by:** See [associated paper](https://arxiv.org/abs/1901.07291) and [GitHub Repo](https://github.com/facebookresearch/XLM)
- **Model type:** Language model
- **Language(s) (NLP):** 100 languages; see the [GitHub Repo](https://github.com/facebookresearch/XLM#the-17-and-100-languages) for the full list.
- **License:** CC-BY-NC-4.0
- **Related Models:** [xlm-mlm-17-1280](https://huggingface.co/xlm-mlm-17-1280)
- **Resources for more information:**
  - [Associated paper](https://arxiv.org/abs/1901.07291)
  - [GitHub Repo](https://github.com/facebookresearch/XLM#the-17-and-100-languages)
  - [Hugging Face Multilingual Models for Inference docs](https://huggingface.co/docs/transformers/v4.20.1/en/multilingual#xlm-with-language-embeddings)

# Uses

## Direct Use

The model is a language model and can be used for masked language modeling, i.e., predicting a masked token in a piece of text.

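As a minimal sketch of direct use (an illustration, not from the model developers), the snippet below runs this checkpoint through the Hugging Face `transformers` fill-mask pipeline. The example sentence and `top_k` value are arbitrary choices, and the pipeline is assumed to read the checkpoint's mask token (XLM checkpoints typically use `<special1>`) from its tokenizer.

```python
# Hedged sketch: masked language modeling with the fill-mask pipeline.
# Assumes `transformers` and a backend such as PyTorch are installed.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="xlm-mlm-100-1280")

# Read the mask token from the tokenizer rather than hard-coding it.
mask = unmasker.tokenizer.mask_token
predictions = unmasker(f"Paris is the capital of {mask}.", top_k=5)

for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict with the candidate token (`token_str`), its probability (`score`), and the completed sequence.
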
## Downstream Use

To learn more about this task and potential downstream uses, see the Hugging Face [fill mask docs](https://huggingface.co/tasks/fill-mask) and the [Hugging Face Multilingual Models for Inference](https://huggingface.co/docs/transformers/v4.20.1/en/multilingual#xlm-with-language-embeddings) docs. Also see the [associated paper](https://arxiv.org/abs/1901.07291).

## Out-of-Scope Use

The model should not be used to intentionally create hostile or alienating environments for people.

# Bias, Risks, and Limitations

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).

## Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model.

# Training

This model is the XLM model trained on Wikipedia text in 100 languages. Preprocessing included tokenization and byte-pair encoding (BPE). See the [GitHub repo](https://github.com/facebookresearch/XLM#the-17-and-100-languages) and the [associated paper](https://arxiv.org/abs/1901.07291) for further details on the training data and training procedure.

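To make that preprocessing concrete, here is a small, hedged sketch that loads the checkpoint's tokenizer through `transformers` and prints the BPE units it produces; the sample sentence is arbitrary and not taken from the training data.

```python
# Hedged sketch: inspecting the tokenization and BPE used by this checkpoint.
# Assumes `transformers` is installed (XLM tokenization also depends on
# `sacremoses` for Moses-style preprocessing).
from transformers import XLMTokenizer

tokenizer = XLMTokenizer.from_pretrained("xlm-mlm-100-1280")

# In XLM's BPE scheme, a subword ending in "</w>" closes a word.
tokens = tokenizer.tokenize("Cross-lingual pretraining scales to 100 languages.")
print(tokens)
print(tokenizer.convert_tokens_to_ids(tokens))
```
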
# Evaluation

## Testing Data, Factors & Metrics

The model developers evaluated the model on the XNLI cross-lingual classification task (see the [XNLI data card](https://huggingface.co/datasets/xnli) for more details on XNLI) using test accuracy as the metric. See the [GitHub Repo](https://github.com/facebookresearch/XLM#ii-cross-lingual-language-model-pretraining-xlm) for further details on the testing data, factors, and metrics.

## Results

For xlm-mlm-100-1280, the test accuracies on the XNLI cross-lingual classification task in English (en), Spanish (es), German (de), Arabic (ar), Chinese (zh), and Urdu (ur) are:

|Language| en | es | de | ar | zh | ur |
|:------:|:--:|:--:|:--:|:--:|:--:|:--:|
|Accuracy|83.7|76.6|73.6|67.4|71.7|62.9|

See the [GitHub repo](https://github.com/facebookresearch/XLM#ii-cross-lingual-language-model-pretraining-xlm) for further details.

# Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed

# Citation

**BibTeX:**

```bibtex
@article{lample2019cross,
  title={Cross-lingual language model pretraining},
  author={Lample, Guillaume and Conneau, Alexis},
  journal={arXiv preprint arXiv:1901.07291},
  year={2019}
}
```

**APA:**
- Lample, G., & Conneau, A. (2019). Cross-lingual language model pretraining. arXiv preprint arXiv:1901.07291.

# Model Card Authors

This model card was written by the team at Hugging Face.

# How to Get Started with the Model

More information needed. See the [IPython notebook](https://github.com/facebookresearch/XLM/blob/main/generate-embeddings.ipynb) in the associated [GitHub repo](https://github.com/facebookresearch/XLM#the-17-and-100-languages) for examples.
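
Until the developers document an official recipe, the hedged sketch below shows one way to load the checkpoint with `transformers` and score a masked token directly. Per the Hugging Face multilingual docs linked above, this 100-language checkpoint does not require language embeddings at inference, so no `langs` tensor is passed.

```python
# Hedged sketch: load the checkpoint and predict a masked token.
# Assumes `transformers` and `torch` are installed; the example text is arbitrary.
import torch
from transformers import XLMTokenizer, XLMWithLMHeadModel

tokenizer = XLMTokenizer.from_pretrained("xlm-mlm-100-1280")
model = XLMWithLMHeadModel.from_pretrained("xlm-mlm-100-1280")

text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the masked position and take the highest-scoring vocabulary entry.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```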