bourdoiscatie committed on
Commit
b3e1791
1 Parent(s): 6045845

Add language tags


Also fix a link.

Note that the link in the following quote doesn't work anymore:
> The code for the distillation process can be found [here](https://github.com/huggingface/transformers/tree/master/examples/distillation)

Files changed (1)
  1. README.md +106 -2
README.md CHANGED
@@ -1,5 +1,109 @@
  ---
- language: multilingual
+ language:
+ - multilingual
+ - af
+ - sq
+ - ar
+ - an
+ - hy
+ - ast
+ - az
+ - ba
+ - eu
+ - bar
+ - be
+ - bn
+ - inc
+ - bs
+ - br
+ - bg
+ - my
+ - ca
+ - ceb
+ - ce
+ - zh
+ - cv
+ - hr
+ - cs
+ - da
+ - nl
+ - en
+ - et
+ - fi
+ - fr
+ - gl
+ - ka
+ - de
+ - el
+ - gu
+ - ht
+ - he
+ - hi
+ - hu
+ - is
+ - io
+ - id
+ - ga
+ - it
+ - ja
+ - jv
+ - kn
+ - kk
+ - ky
+ - ko
+ - la
+ - lv
+ - lt
+ - roa
+ - nds
+ - lm
+ - mk
+ - mg
+ - ms
+ - ml
+ - mr
+ - mn
+ - min
+ - ne
+ - new
+ - nb
+ - nn
+ - oc
+ - fa
+ - pms
+ - pl
+ - pt
+ - pa
+ - ro
+ - ru
+ - sco
+ - sr
+ - hr
+ - scn
+ - sk
+ - sl
+ - aze
+ - es
+ - su
+ - sw
+ - sv
+ - tl
+ - tg
+ - th
+ - ta
+ - tt
+ - te
+ - tr
+ - uk
+ - ud
+ - uz
+ - vi
+ - vo
+ - war
+ - cy
+ - fry
+ - pnb
+ - yo
  license: apache-2.0
  datasets:
  - wikipedia
@@ -22,7 +126,7 @@ datasets:
 
  ## Model Description
 
- This model is a distilled version of the [BERT base multilingual model](bert-base-multilingual-cased). The code for the distillation process can be found
+ This model is a distilled version of the [BERT base multilingual model](https://huggingface.co/bert-base-multilingual-cased/). The code for the distillation process can be found
  [here](https://github.com/huggingface/transformers/tree/master/examples/distillation). This model is cased: it does make a difference between english and English.
 
  The model is trained on the concatenation of Wikipedia in 104 different languages listed [here](https://github.com/google-research/bert/blob/master/multilingual.md#list-of-languages).
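
For readers who want to try the model this card describes, here is a minimal sketch assuming the checkpoint is published as `distilbert-base-multilingual-cased` (the commit never names the repository id, so treat it as a placeholder). It also illustrates the card's remark that the model is cased:

```python
# Minimal sketch: load the (assumed) distilled multilingual checkpoint.
# "distilbert-base-multilingual-cased" is an assumption -- the commit does
# not name the repository id.
from transformers import AutoModel, AutoTokenizer

model_id = "distilbert-base-multilingual-cased"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# The card notes the model is cased: a cased vocabulary tokenizes
# lowercase "english" and capitalized "English" differently.
print(tokenizer.tokenize("english"))
print(tokenizer.tokenize("English"))
```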