---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-xls-r-300m-gn-cv8
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-300m-gn-cv8

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0820
- WER: 0.7212
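
WER here is the word error rate: the proportion of substituted, deleted, and inserted words relative to the reference transcripts, so 0.7212 corresponds to roughly 72% of reference words being in error. As a minimal sketch (not taken from this repository), such a score can be computed with the `wer` metric from the `datasets` library; the strings below are placeholders, and `jiwer` must be installed for the metric to load:

```python
from datasets import load_metric  # the "wer" metric requires `pip install jiwer`

wer_metric = load_metric("wer")

# Placeholder strings; the card does not include example transcripts.
predictions = ["predicted transcript"]
references = ["reference transcript"]

print(wer_metric.compute(predictions=predictions, references=references))
```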

## Model description

More information needed

## Intended uses & limitations

More information needed
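
Pending proper documentation, the model name suggests this checkpoint is intended for automatic speech recognition of Guaraní (`gn`) audio sampled at 16 kHz. A minimal inference sketch under that assumption follows; the Hub model ID is a placeholder (the namespace is not stated in this card), and `example.wav` stands in for any audio file:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-xls-r-300m-gn-cv8"  # placeholder: replace with the full Hub ID

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file, downmix to mono, and resample to the 16 kHz rate expected by XLS-R models.
waveform, sample_rate = torchaudio.load("example.wav")
waveform = waveform.mean(dim=0)
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the highest-scoring token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```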

## Training and evaluation data

More information needed
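
Although details are missing, the model name (`gn` for Guaraní, `cv8` for Common Voice 8.0) suggests the data came from the Guaraní subset of Common Voice 8.0. A data-loading sketch under that assumption is below; the dataset ID, config, and split names are assumptions, not taken from this card:

```python
from datasets import load_dataset

# Assumed dataset ID and config; Common Voice 8.0 on the Hub is gated,
# so an authenticated Hugging Face token is required.
train_data = load_dataset(
    "mozilla-foundation/common_voice_8_0", "gn",
    split="train+validation", use_auth_token=True,
)
test_data = load_dataset(
    "mozilla-foundation/common_voice_8_0", "gn",
    split="test", use_auth_token=True,
)
```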

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 5000
- mixed_precision_training: Native AMP
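
For orientation, the list above maps roughly onto `TrainingArguments` as sketched below (argument names per Transformers 4.15; the output directory and any setting not listed above are assumptions):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-gn-cv8",  # assumed output directory
    learning_rate=1e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,   # effective train batch size: 1 * 8 = 8
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=5000,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```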

### Training results

| Training Loss | Epoch  | Step | Validation Loss | WER    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 12.7853       | 2.76   | 100  | 4.7861          | 1.0    |
| 3.4153        | 5.55   | 200  | 3.5519          | 1.0    |
| 3.2923        | 8.33   | 300  | 3.3052          | 1.0    |
| 3.2119        | 11.11  | 400  | 3.1202          | 1.0    |
| 2.5099        | 13.87  | 500  | 1.6023          | 0.9872 |
| 1.3373        | 16.66  | 600  | 1.1878          | 0.9182 |
| 0.913         | 19.44  | 700  | 1.0049          | 0.8875 |
| 0.7013        | 22.22  | 800  | 0.9810          | 0.8542 |
| 0.5439        | 24.98  | 900  | 0.9463          | 0.8568 |
| 0.4581        | 27.76  | 1000 | 0.9771          | 0.8261 |
| 0.392         | 30.55  | 1100 | 0.9489          | 0.8389 |
| 0.3555        | 33.33  | 1200 | 0.8846          | 0.8107 |
| 0.3219        | 36.11  | 1300 | 0.8567          | 0.7980 |
| 0.2794        | 38.87  | 1400 | 0.8851          | 0.7775 |
| 0.2649        | 41.66  | 1500 | 0.9642          | 0.7954 |
| 0.2407        | 44.44  | 1600 | 0.9540          | 0.8133 |
| 0.2184        | 47.22  | 1700 | 0.8820          | 0.7494 |
| 0.2181        | 49.98  | 1800 | 0.9349          | 0.8031 |
| 0.1863        | 52.76  | 1900 | 0.9557          | 0.7494 |
| 0.1728        | 55.55  | 2000 | 1.0587          | 0.7519 |
| 0.1848        | 58.33  | 2100 | 1.0072          | 0.8056 |
| 0.1602        | 61.11  | 2200 | 0.9321          | 0.7980 |
| 0.1479        | 63.87  | 2300 | 0.9669          | 0.8005 |
| 0.1464        | 66.66  | 2400 | 0.9914          | 0.7545 |
| 0.1442        | 69.44  | 2500 | 1.0479          | 0.8184 |
| 0.1385        | 72.22  | 2600 | 1.0065          | 0.7647 |
| 0.1201        | 74.98  | 2700 | 0.9956          | 0.7801 |
| 0.1264        | 77.76  | 2800 | 1.0153          | 0.7801 |
| 0.1143        | 80.55  | 2900 | 0.9973          | 0.7826 |
| 0.1145        | 83.33  | 3000 | 0.9762          | 0.7698 |
| 0.1264        | 86.11  | 3100 | 0.9494          | 0.7391 |
| 0.1093        | 88.87  | 3200 | 1.0091          | 0.7801 |
| 0.0988        | 91.66  | 3300 | 1.0605          | 0.7621 |
| 0.103         | 94.44  | 3400 | 0.9910          | 0.7340 |
| 0.0972        | 97.22  | 3500 | 1.0412          | 0.7519 |
| 0.0974        | 99.98  | 3600 | 1.0361          | 0.7621 |
| 0.0836        | 102.76 | 3700 | 0.9969          | 0.7673 |
| 0.0795        | 105.55 | 3800 | 1.0198          | 0.7545 |
| 0.0839        | 108.33 | 3900 | 1.0269          | 0.7698 |
| 0.0856        | 111.11 | 4000 | 0.9913          | 0.7442 |
| 0.0721        | 113.87 | 4100 | 1.0239          | 0.7621 |
| 0.0711        | 116.66 | 4200 | 1.0360          | 0.7468 |
| 0.0771        | 119.44 | 4300 | 1.0799          | 0.7289 |
| 0.0624        | 122.22 | 4400 | 1.1323          | 0.7238 |
| 0.0748        | 124.98 | 4500 | 1.0868          | 0.7366 |
| 0.0644        | 127.76 | 4600 | 1.0658          | 0.7289 |
| 0.0667        | 130.55 | 4700 | 1.0731          | 0.7212 |
| 0.0624        | 133.33 | 4800 | 1.0794          | 0.7289 |
| 0.0714        | 136.11 | 4900 | 1.0832          | 0.7238 |
| 0.0627        | 138.87 | 5000 | 1.0820          | 0.7212 |


### Framework versions

- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3