---
language:
- all
license: apache-2.0
tags:
- fleurs-lang_id
- google/xtreme_s
- generated_from_trainer
datasets:
- xtreme_s
metrics:
- accuracy
model-index:
- name: xtreme_s_xlsr_300m_fleurs_langid_truncated
  results: []
---

# xtreme_s_xlsr_300m_fleurs_langid_truncated

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the GOOGLE/XTREME_S - FLEURS.ALL dataset.
It achieves the following results on the evaluation set:
- Epoch: 4.07 (identical for all 102 FLEURS locales)
- Accuracy: 0.7271
- Accuracy Af Za: 0.3865
- Accuracy Am Et: 0.8818
- Accuracy Ar Eg: 0.9977
- Accuracy As In: 0.9858
- Accuracy Ast Es: 0.8362
- Accuracy Az Az: 0.8386
- Accuracy Be By: 0.4085
- Accuracy Bn In: 0.9989
- Accuracy Bs Ba: 0.2508
- Accuracy Ca Es: 0.6947
- Accuracy Ceb Ph: 0.9852
- Accuracy Cmn Hans Cn: 0.9799
- Accuracy Cs Cz: 0.5353
- Accuracy Cy Gb: 0.9716
- Accuracy Da Dk: 0.6688
- Accuracy De De: 0.7807
- Accuracy El Gr: 0.7692
- Accuracy En Us: 0.9815
- Accuracy Es 419: 0.9846
- Accuracy Et Ee: 0.5230
- Accuracy Fa Ir: 0.8462
- Accuracy Ff Sn: 0.2348
- Accuracy Fi Fi: 0.9978
- Accuracy Fil Ph: 0.9564
- Accuracy Fr Fr: 0.9852
- Accuracy Ga Ie: 0.8468
- Accuracy Gl Es: 0.5016
- Accuracy Gu In: 0.9730
- Accuracy Ha Ng: 0.9163
- Accuracy He Il: 0.8043
- Accuracy Hi In: 0.9354
- Accuracy Hr Hr: 0.3654
- Accuracy Hu Hu: 0.8044
- Accuracy Hy Am: 0.9914
- Accuracy Id Id: 0.9869
- Accuracy Ig Ng: 0.9360
- Accuracy Is Is: 0.0217
- Accuracy It It: 0.8000
- Accuracy Ja Jp: 0.7385
- Accuracy Jv Id: 0.5824
- Accuracy Ka Ge: 0.8611
- Accuracy Kam Ke: 0.4184
- Accuracy Kea Cv: 0.8692
- Accuracy Kk Kz: 0.8727
- Accuracy Km Kh: 0.7030
- Accuracy Kn In: 0.9630
- Accuracy Ko Kr: 0.9843
- Accuracy Ku Arab Iq: 0.9577
- Accuracy Ky Kg: 0.8936
- Accuracy Lb Lu: 0.8897
- Accuracy Lg Ug: 0.9253
- Accuracy Ln Cd: 0.9644
- Accuracy Lo La: 0.1580
- Accuracy Lt Lt: 0.4686
- Accuracy Luo Ke: 0.9922
- Accuracy Lv Lv: 0.6498
- Accuracy Mi Nz: 0.9613
- Accuracy Mk Mk: 0.7636
- Accuracy Ml In: 0.6962
- Accuracy Mn Mn: 0.8462
- Accuracy Mr In: 0.3911
- Accuracy Ms My: 0.3632
- Accuracy Mt Mt: 0.6188
- Accuracy My Mm: 0.9705
- Accuracy Nb No: 0.6891
- Accuracy Ne Np: 0.8994
- Accuracy Nl Nl: 0.9093
- Accuracy Nso Za: 0.8873
- Accuracy Ny Mw: 0.4691
- Accuracy Oci Fr: 0.1533
- Accuracy Om Et: 0.9512
- Accuracy Or In: 0.5447
- Accuracy Pa In: 0.8153
- Accuracy Pl Pl: 0.7757
- Accuracy Ps Af: 0.8105
- Accuracy Pt Br: 0.7715
- Accuracy Ro Ro: 0.4122
- Accuracy Ru Ru: 0.9794
- Accuracy Rup Bg: 0.9468
- Accuracy Sd Arab In: 0.5245
- Accuracy Sk Sk: 0.8624
- Accuracy Sl Si: 0.0300
- Accuracy Sn Zw: 0.8843
- Accuracy So So: 0.8803
- Accuracy Sr Rs: 0.0257
- Accuracy Sv Se: 0.0145
- Accuracy Sw Ke: 0.9199
- Accuracy Ta In: 0.9526
- Accuracy Te In: 0.9788
- Accuracy Tg Tj: 0.9883
- Accuracy Th Th: 0.9912
- Accuracy Tr Tr: 0.7887
- Accuracy Uk Ua: 0.0627
- Accuracy Umb Ao: 0.7863
- Accuracy Ur Pk: 0.0134
- Accuracy Uz Uz: 0.4014
- Accuracy Vi Vn: 0.7246
- Accuracy Wo Sn: 0.4555
- Accuracy Xh Za: 1.0000
- Accuracy Yo Ng: 0.7353
- Accuracy Yue Hant Hk: 0.7985
- Accuracy Zu Za: 0.4696
- Loss: 1.3789
- Loss Af Za: 2.6778
- Loss Am Et: 0.4615
- Loss Ar Eg: 0.0149
- Loss As In: 0.0764
- Loss Ast Es: 0.4560
- Loss Az Az: 0.5677
- Loss Be By: 1.9231
- Loss Bn In: 0.0024
- Loss Bs Ba: 2.4954
- Loss Ca Es: 1.2632
- Loss Ceb Ph: 0.0426
- Loss Cmn Hans Cn: 0.0650
- Loss Cs Cz: 1.9334
- Loss Cy Gb: 0.1274
- Loss Da Dk: 1.4990
- Loss De De: 0.8820
- Loss El Gr: 0.9839
- Loss En Us: 0.0827
- Loss Es 419: 0.0516
- Loss Et Ee: 1.9264
- Loss Fa Ir: 0.6520
- Loss Ff Sn: 5.4283
- Loss Fi Fi: 0.0109
- Loss Fil Ph: 0.1706
- Loss Fr Fr: 0.0591
- Loss Ga Ie: 0.5174
- Loss Gl Es: 1.2657
- Loss Gu In: 0.0850
- Loss Ha Ng: 0.3234
- Loss He Il: 0.8299
- Loss Hi In: 0.4190
- Loss Hr Hr: 2.9754
- Loss Hu Hu: 0.8345
- Loss Hy Am: 0.0329
- Loss Id Id: 0.0529
- Loss Ig Ng: 0.2523
- Loss Is Is: 6.5153
- Loss It It: 0.8113
- Loss Ja Jp: 1.3968
- Loss Jv Id: 2.0009
- Loss Ka Ge: 0.6162
- Loss Kam Ke: 2.2192
- Loss Kea Cv: 0.5567
- Loss Kk Kz: 0.5592
- Loss Km Kh: 1.7358
- Loss Kn In: 0.1063
- Loss Ko Kr: 0.1519
- Loss Ku Arab Iq: 0.2075
- Loss Ky Kg: 0.4639
- Loss Lb Lu: 0.4454
- Loss Lg Ug: 0.3764
- Loss Ln Cd: 0.1844
- Loss Lo La: 3.8051
- Loss Lt Lt: 2.5054
- Loss Luo Ke: 0.0479
- Loss Lv Lv: 1.3713
- Loss Mi Nz: 0.1390
- Loss Mk Mk: 0.7952
- Loss Ml In: 1.2999
- Loss Mn Mn: 0.7621
- Loss Mr In: 3.7056
- Loss Ms My: 3.0192
- Loss Mt Mt: 1.5520
- Loss My Mm: 0.1514
- Loss Nb No: 1.1194
- Loss Ne Np: 0.4231
- Loss Nl Nl: 0.3291
- Loss Nso Za: 0.5106
- Loss Ny Mw: 2.7346
- Loss Oci Fr: 5.0983
- Loss Om Et: 0.2297
- Loss Or In: 2.5432
- Loss Pa In: 0.7753
- Loss Pl Pl: 0.7309
- Loss Ps Af: 1.0454
- Loss Pt Br: 0.9782
- Loss Ro Ro: 3.5829
- Loss Ru Ru: 0.0598
- Loss Rup Bg: 0.1695
- Loss Sd Arab In: 2.6198
- Loss Sk Sk: 0.5583
- Loss Sl Si: 6.0923
- Loss Sn Zw: 0.4465
- Loss So So: 0.4492
- Loss Sr Rs: 4.7575
- Loss Sv Se: 6.5858
- Loss Sw Ke: 0.4235
- Loss Ta In: 0.1818
- Loss Te In: 0.0808
- Loss Tg Tj: 0.0912
- Loss Th Th: 0.0462
- Loss Tr Tr: 0.7340
- Loss Uk Ua: 4.6777
- Loss Umb Ao: 1.4021
- Loss Ur Pk: 8.4067
- Loss Uz Uz: 4.3297
- Loss Vi Vn: 1.1304
- Loss Wo Sn: 2.2281
- Loss Xh Za: 0.0009
- Loss Yo Ng: 1.3345
- Loss Yue Hant Hk: 1.0728
- Loss Zu Za: 3.7279
- Predict Samples: 77960

## Model description

This checkpoint is [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) with an audio-classification head, fine-tuned for spoken language identification over the 102 FLEURS locales (the `fleurs-lang_id` task of XTREME-S). The `truncated` suffix in the model name presumably refers to input audio being truncated to a fixed maximum length during training; a sketch of the architecture setup follows.
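
A minimal sketch of how this architecture would be instantiated from the base checkpoint (assumptions: the fine-tune uses the standard Transformers audio-classification head, and the label count comes from the 102 locales listed above):

```python
# Sketch: recreate the fine-tuning architecture from the base checkpoint.
# The classification head is freshly initialized; 102 labels = FLEURS locales.
from transformers import AutoConfig, AutoModelForAudioClassification

config = AutoConfig.from_pretrained(
    "facebook/wav2vec2-xls-r-300m",
    num_labels=102,  # one class per FLEURS locale, e.g. af_za, am_et, ...
)
model = AutoModelForAudioClassification.from_pretrained(
    "facebook/wav2vec2-xls-r-300m",
    config=config,
)
```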

## Intended uses & limitations

The model is intended for identifying which of the 102 FLEURS locales is spoken in a 16 kHz audio clip, e.g. via the `audio-classification` pipeline shown below. Per-locale quality varies widely: some languages are near-perfect (Xh Za: 1.0000, Ar Eg: 0.9977) while others are close to chance or worse (Ur Pk: 0.0134, Sv Se: 0.0145, Is Is: 0.0217), so check the per-language results above before relying on the model for a specific language.
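
A minimal inference sketch (assumptions: the checkpoint is pushed to the Hub, so the model path below is a hypothetical stand-in for the full `<user>/<repo>` identifier, and the input file is 16 kHz audio):

```python
from transformers import pipeline

# Hypothetical repo path -- substitute the actual Hub identifier.
langid = pipeline(
    "audio-classification",
    model="xtreme_s_xlsr_300m_fleurs_langid_truncated",
)

# 16 kHz mono audio, as expected by wav2vec2-xls-r-300m.
for pred in langid("speech_sample.wav", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```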

## Training and evaluation data

Training and evaluation use the `fleurs.all` configuration of [google/xtreme_s](https://huggingface.co/datasets/google/xtreme_s), which pools speech from all 102 FLEURS locales into a single language-identification task; the evaluation above spans 77,960 predict samples. A loading sketch follows.
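
A sketch of loading the data with 🤗 Datasets (assumption: the `lang_id` column name for the language-identification target, which is what the XTREME-S FLEURS configs use):

```python
from datasets import load_dataset

# "fleurs.all" pools every FLEURS locale into one configuration.
fleurs = load_dataset("google/xtreme_s", "fleurs.all")

sample = fleurs["train"][0]
print(sample["audio"]["sampling_rate"])  # 16000
print(sample["lang_id"])                 # integer class label (assumed column name)
```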

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 64
- total_eval_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 5.0
- mixed_precision_training: Native AMP
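
A sketch of the corresponding `TrainingArguments` (not the exact launch command; the total batch sizes of 64 and 8 follow from 8 and 1 samples per device across 8 GPUs, and the Adam betas/epsilon listed above are the optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xtreme_s_xlsr_300m_fleurs_langid_truncated",
    learning_rate=3e-4,
    per_device_train_batch_size=8,  # x 8 GPUs -> total_train_batch_size = 64
    per_device_eval_batch_size=1,   # x 8 GPUs -> total_eval_batch_size = 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    num_train_epochs=5.0,
    fp16=True,                      # "Native AMP" mixed-precision training
)
```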

### Training results

| Training Loss | Epoch | Step  | Accuracy | Validation Loss |
|:-------------:|:-----:|:-----:|:--------:|:---------------:|
| 0.5296        | 0.26  | 1000  | 0.4016   | 2.6633          |
| 0.4252        | 0.52  | 2000  | 0.5751   | 1.8582          |
| 0.2989        | 0.78  | 3000  | 0.6332   | 1.6780          |
| 0.3563        | 1.04  | 4000  | 0.6799   | 1.4479          |
| 0.1617        | 1.3   | 5000  | 0.6679   | 1.5066          |
| 0.1409        | 1.56  | 6000  | 0.6992   | 1.4082          |
| 0.01          | 1.82  | 7000  | 0.7071   | 1.2448          |
| 0.0018        | 2.08  | 8000  | 0.7148   | 1.1996          |
| 0.0014        | 2.34  | 9000  | 0.6410   | 1.6505          |
| 0.0188        | 2.6   | 10000 | 0.6840   | 1.4050          |
| 0.0007        | 2.86  | 11000 | 0.6621   | 1.5831          |
| 0.1038        | 3.12  | 12000 | 0.6829   | 1.5441          |
| 0.0003        | 3.38  | 13000 | 0.6900   | 1.3483          |
| 0.0004        | 3.64  | 14000 | 0.6414   | 1.7070          |
| 0.0003        | 3.9   | 15000 | 0.7075   | 1.3198          |
| 0.0002        | 4.16  | 16000 | 0.7105   | 1.3118          |
| 0.0001        | 4.42  | 17000 | 0.7029   | 1.4099          |
| 0.0           | 4.68  | 18000 | 0.7180   | 1.3658          |
| 0.0001        | 4.93  | 19000 | 0.7236   | 1.3514          |


### Framework versions

- Transformers 4.18.0.dev0
- Pytorch 1.10.1+cu111
- Datasets 1.18.4.dev0
- Tokenizers 0.11.6