---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: xlsr-a-nomimose
  results: []
---

# xlsr-a-nomimose

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4623
- WER (word error rate): 0.3473
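
WER counts word-level substitutions, insertions, and deletions against the reference transcript, divided by the number of reference words; a WER of 0.3473 means roughly one word in three differs from the reference. A minimal sketch of how the metric can be computed with the `jiwer` library (the card does not show the evaluation code, so this is illustrative only):

```python
import jiwer

reference = "the quick brown fox jumps over the lazy dog"
hypothesis = "the quick brown fox jump over lazy dog"

# WER = (substitutions + insertions + deletions) / reference word count:
# here one substitution (jumps -> jump) and one deletion (the).
print(jiwer.wer(reference, hypothesis))  # 2 / 9 ≈ 0.222
```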

## Model description

More information needed

## Intended uses & limitations

More information needed
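
As a rough illustration of intended use, the checkpoint can be loaded for speech-to-text inference like any wav2vec2 CTC model. The hub path and audio file below are placeholders, not values taken from this card:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "xlsr-a-nomimose"  # placeholder: replace with the full hub path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

# wav2vec2 expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```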

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP
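
A minimal sketch of these settings expressed as `TrainingArguments`; `output_dir` and the evaluation cadence are assumptions (the results table suggests evaluation every 200 steps), not values confirmed by the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-a-nomimose",   # assumed output directory
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed-precision training
    eval_strategy="steps",
    eval_steps=200,                 # matches the 200-step cadence in the results table
)
```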

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 4.7536        | 1.8387  | 200   | 2.6165          | 1.0    |
| 1.8115        | 3.6728  | 400   | 0.6163          | 0.6689 |
| 0.469         | 5.5069  | 600   | 0.5160          | 0.5442 |
| 0.2745        | 7.3410  | 800   | 0.2917          | 0.4108 |
| 0.1694        | 9.1751  | 1000  | 0.3568          | 0.4624 |
| 0.1782        | 11.0092 | 1200  | 0.3223          | 0.3901 |
| 0.1406        | 12.8479 | 1400  | 0.2959          | 0.3923 |
| 0.1046        | 14.6820 | 1600  | 0.3131          | 0.3658 |
| 0.109         | 16.5161 | 1800  | 0.3343          | 0.3739 |
| 0.1135        | 18.3502 | 2000  | 0.3368          | 0.3577 |
| 0.0875        | 20.1843 | 2200  | 0.4722          | 0.3695 |
| 0.0956        | 22.0184 | 2400  | 0.3427          | 0.3614 |
| 0.0749        | 23.8571 | 2600  | 0.3377          | 0.3695 |
| 0.0727        | 25.6912 | 2800  | 0.3489          | 0.3599 |
| 0.0655        | 27.5253 | 3000  | 0.3348          | 0.3665 |
| 0.0603        | 29.3594 | 3200  | 0.3636          | 0.3606 |
| 0.0537        | 31.1935 | 3400  | 0.3923          | 0.3555 |
| 0.0567        | 33.0276 | 3600  | 0.3476          | 0.3555 |
| 0.0461        | 34.8664 | 3800  | 0.3589          | 0.3628 |
| 0.0459        | 36.7005 | 4000  | 0.4104          | 0.3584 |
| 0.0467        | 38.5346 | 4200  | 0.3686          | 0.3555 |
| 0.0419        | 40.3687 | 4400  | 0.3889          | 0.3555 |
| 0.0381        | 42.2028 | 4600  | 0.4013          | 0.3540 |
| 0.0403        | 44.0369 | 4800  | 0.4077          | 0.3555 |
| 0.0371        | 45.8756 | 5000  | 0.4502          | 0.3577 |
| 0.0354        | 47.7097 | 5200  | 0.4884          | 0.3739 |
| 0.0372        | 49.5438 | 5400  | 0.4227          | 0.3614 |
| 0.0344        | 51.3779 | 5600  | 0.3949          | 0.3532 |
| 0.0288        | 53.2120 | 5800  | 0.4088          | 0.3532 |
| 0.0314        | 55.0461 | 6000  | 0.4194          | 0.3496 |
| 0.0276        | 56.8848 | 6200  | 0.4184          | 0.3532 |
| 0.0232        | 58.7189 | 6400  | 0.4184          | 0.3496 |
| 0.0267        | 60.5530 | 6600  | 0.4028          | 0.3496 |
| 0.0271        | 62.3871 | 6800  | 0.3804          | 0.3488 |
| 0.0205        | 64.2212 | 7000  | 0.4735          | 0.3466 |
| 0.0243        | 66.0553 | 7200  | 0.4108          | 0.3473 |
| 0.0198        | 67.8940 | 7400  | 0.4157          | 0.3488 |
| 0.0193        | 69.7281 | 7600  | 0.4226          | 0.3547 |
| 0.0222        | 71.5622 | 7800  | 0.4147          | 0.3451 |
| 0.0176        | 73.3963 | 8000  | 0.4553          | 0.3488 |
| 0.0209        | 75.2304 | 8200  | 0.4135          | 0.3525 |
| 0.0222        | 77.0645 | 8400  | 0.4300          | 0.3481 |
| 0.0169        | 78.9032 | 8600  | 0.4139          | 0.3481 |
| 0.0174        | 80.7373 | 8800  | 0.4510          | 0.3473 |
| 0.019         | 82.5714 | 9000  | 0.4664          | 0.3488 |
| 0.0145        | 84.4055 | 9200  | 0.4768          | 0.3459 |
| 0.0146        | 86.2396 | 9400  | 0.4678          | 0.3466 |
| 0.0184        | 88.0737 | 9600  | 0.4906          | 0.3488 |
| 0.0141        | 89.9124 | 9800  | 0.4676          | 0.3481 |
| 0.0128        | 91.7465 | 10000 | 0.4612          | 0.3473 |
| 0.0134        | 93.5806 | 10200 | 0.4649          | 0.3459 |
| 0.0134        | 95.4147 | 10400 | 0.4606          | 0.3481 |
| 0.0126        | 97.2488 | 10600 | 0.4646          | 0.3488 |
| 0.014         | 99.0829 | 10800 | 0.4623          | 0.3473 |


### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0