---
license: cc-by-nc-4.0
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-base-fi-voxpopuli-v2-finetuned
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-base-fi-voxpopuli-v2-finetuned

This model is a fine-tuned version of [facebook/wav2vec2-base-fi-voxpopuli-v2](https://huggingface.co/facebook/wav2vec2-base-fi-voxpopuli-v2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1316
- Wer: 0.1498
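
Below is a minimal inference sketch for transcribing Finnish speech with this checkpoint. It assumes the model is published on the Hugging Face Hub under a repo id ending in `wav2vec2-base-fi-voxpopuli-v2-finetuned` (the exact path is hypothetical here) and that `sample_fi.wav` is a local audio file.

```python
# Minimal inference sketch; the repo id and audio path below are placeholders.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-base-fi-voxpopuli-v2-finetuned"  # hypothetical Hub path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load the audio and resample it to the 16 kHz rate the model expects.
waveform, sample_rate = torchaudio.load("sample_fi.wav")
speech = torchaudio.functional.resample(waveform, sample_rate, 16_000).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```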

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
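
As a rough guide, the hyperparameters above map onto `transformers.TrainingArguments` as sketched below; the output directory is hypothetical, and the model, dataset, and data collator setup are omitted.

```python
# Sketch of the listed hyperparameters as TrainingArguments (setup code omitted).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-fi-voxpopuli-v2-finetuned",  # hypothetical path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",  # Adam with betas=(0.9, 0.999), eps=1e-8 is the Trainer default
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                   # "Native AMP" mixed-precision training
)
```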

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.575         | 0.33  | 500   | 0.7454          | 0.7048 |
| 0.5838        | 0.66  | 1000  | 0.2377          | 0.2608 |
| 0.5692        | 1.0   | 1500  | 0.2014          | 0.2244 |
| 0.5112        | 1.33  | 2000  | 0.1885          | 0.2013 |
| 0.4857        | 1.66  | 2500  | 0.1881          | 0.2120 |
| 0.4821        | 1.99  | 3000  | 0.1603          | 0.1894 |
| 0.4531        | 2.32  | 3500  | 0.1594          | 0.1865 |
| 0.4411        | 2.65  | 4000  | 0.1641          | 0.1874 |
| 0.4437        | 2.99  | 4500  | 0.1545          | 0.1874 |
| 0.4191        | 3.32  | 5000  | 0.1565          | 0.1770 |
| 0.4158        | 3.65  | 5500  | 0.1696          | 0.1867 |
| 0.4032        | 3.98  | 6000  | 0.1561          | 0.1746 |
| 0.4003        | 4.31  | 6500  | 0.1432          | 0.1749 |
| 0.4059        | 4.64  | 7000  | 0.1390          | 0.1690 |
| 0.4019        | 4.98  | 7500  | 0.1291          | 0.1646 |
| 0.3811        | 5.31  | 8000  | 0.1485          | 0.1755 |
| 0.3955        | 5.64  | 8500  | 0.1351          | 0.1659 |
| 0.3562        | 5.97  | 9000  | 0.1328          | 0.1614 |
| 0.3646        | 6.3   | 9500  | 0.1329          | 0.1584 |
| 0.351         | 6.64  | 10000 | 0.1342          | 0.1554 |
| 0.3408        | 6.97  | 10500 | 0.1422          | 0.1509 |
| 0.3562        | 7.3   | 11000 | 0.1309          | 0.1528 |
| 0.3335        | 7.63  | 11500 | 0.1305          | 0.1506 |
| 0.3491        | 7.96  | 12000 | 0.1365          | 0.1560 |
| 0.3538        | 8.29  | 12500 | 0.1293          | 0.1512 |
| 0.3338        | 8.63  | 13000 | 0.1328          | 0.1511 |
| 0.3509        | 8.96  | 13500 | 0.1304          | 0.1520 |
| 0.3431        | 9.29  | 14000 | 0.1360          | 0.1517 |
| 0.3309        | 9.62  | 14500 | 0.1328          | 0.1514 |
| 0.3252        | 9.95  | 15000 | 0.1316          | 0.1498 |
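
The `Wer` column above is the word error rate on the validation set (lower is better). A small sketch of how such a score can be computed with the `evaluate` library is shown below; the example strings are illustrative only.

```python
# Word error rate computation sketch using the evaluate library (illustrative strings only).
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["moi maailma"]           # model transcriptions (hypothetical)
references = ["moi maailma kaikille"]   # ground-truth transcripts (hypothetical)
print(wer_metric.compute(predictions=predictions, references=references))
```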


### Framework versions

- Transformers 4.19.1
- Pytorch 1.11.0+cu102
- Datasets 2.2.1
- Tokenizers 0.11.0