julien-c (HF staff) committed on
Commit 1a01b38
1 Parent(s): 785c07b

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/distilbert-base-multilingual-cased-README.md

Files changed (1)
  1. README.md +35 -0
README.md ADDED
---
language: multilingual
license: apache-2.0
datasets:
- wikipedia
---

# DistilBERT base multilingual model (cased)

This model is a distilled version of the [BERT base multilingual model](bert-base-multilingual-cased). The code for the distillation process can be found [here](https://github.com/huggingface/transformers/tree/master/examples/distillation). This model is cased: it does make a difference between english and English.
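As a quick sketch of how the checkpoint can be loaded (the `fill-mask` pipeline below is the standard Transformers API for masked-language models; the example sentence is only illustrative):

```python
from transformers import pipeline

# Load the checkpoint into a masked-language-modeling pipeline.
unmasker = pipeline("fill-mask", model="distilbert-base-multilingual-cased")

# BERT-style tokenizers use [MASK] as the mask token.
for prediction in unmasker("Hello, I'm a [MASK] model."):
    print(prediction["token_str"], round(prediction["score"], 3))
```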

The model is trained on the concatenation of Wikipedia in 104 different languages listed [here](https://github.com/google-research/bert/blob/master/multilingual.md#list-of-languages).
The model has 6 layers, a hidden dimension of 768 and 12 attention heads, totaling 134M parameters (compared to 177M parameters for mBERT-base).
On average, DistilmBERT is twice as fast as mBERT-base.
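The parameter count can be sanity-checked by loading the base model and summing its tensor sizes (a minimal sketch, assuming PyTorch is installed):

```python
from transformers import DistilBertModel

model = DistilBertModel.from_pretrained("distilbert-base-multilingual-cased")

# Sum the element counts of all parameter tensors; this should land around 134M.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")
```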

We encourage you to check out the [BERT base multilingual model](bert-base-multilingual-cased) to learn more about usage, limitations and potential biases.

The table below reports accuracy per language (these figures appear to correspond to the XNLI results reported alongside the distillation code):

| Model | English | Spanish | Chinese | German | Arabic | Urdu |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| mBERT base cased (computed) | 82.1 | 74.6 | 69.1 | 72.3 | 66.4 | 58.5 |
| mBERT base uncased (reported) | 81.4 | 74.3 | 63.8 | 70.5 | 62.1 | 58.3 |
| DistilmBERT | 78.2 | 69.1 | 64.0 | 66.3 | 59.1 | 54.7 |

### BibTeX entry and citation info

```bibtex
@article{Sanh2019DistilBERTAD,
  title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
  author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.01108}
}
```