---
license: apache-2.0
base_model: facebook/wav2vec2-large
tags:
- generated_from_trainer
metrics:
- wer
- cer
model-index:
- name: wav2vec2-large-sw-cv-20hr-v1
  results: []
---

[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/asr-africa-research-team/ASR%20Africa/runs/w0c4nymx)
# wav2vec2-large-sw-cv-20hr-v1

This model is a fine-tuned version of [facebook/wav2vec2-large](https://huggingface.co/facebook/wav2vec2-large) on an unspecified Swahili dataset (see the training and evaluation data section below).
It achieves the following results on the evaluation set:
- Loss: inf (the evaluation loss overflowed to infinity, likely a mixed-precision numerical artifact; WER and CER are computed from the decoded transcripts and are unaffected)
- Model Preparation Time: 0.0059
- Wer: 0.3464
- Cer: 0.1302
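
WER and CER are standard edit-distance metrics over words and characters, respectively. They can be reproduced with the Hugging Face `evaluate` library; the sketch below uses placeholder strings, not actual model output:

```python
# Sketch: computing WER/CER with the `evaluate` library
# (pip install evaluate jiwer). The reference and hypothesis
# strings are placeholders, not output from this model.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["habari ya asubuhi"]   # ground-truth transcripts
predictions = ["habari ya asubui"]   # hypothesized model transcripts

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```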

## Model description

This checkpoint is [facebook/wav2vec2-large](https://huggingface.co/facebook/wav2vec2-large) fine-tuned with a CTC head for Swahili automatic speech recognition, as indicated by the model name and the reported WER/CER metrics. Further details were not recorded in this card.

## Intended uses & limitations

The model is intended for transcribing Swahili speech sampled at 16 kHz (the rate wav2vec2 was pretrained on). It has only been evaluated on its own held-out set (WER 0.3464, CER 0.1302), so accuracy on noisy, spontaneous, or otherwise out-of-domain audio is unknown.
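
Assuming the checkpoint is published on the Hugging Face Hub, transcription should work with the standard `transformers` ASR pipeline. A minimal sketch; the repository namespace and audio path are placeholders:

```python
# Sketch: transcribing a Swahili recording with this checkpoint.
# "NAMESPACE" and "speech.wav" are placeholders; substitute the
# actual Hub repository id and a real 16 kHz audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NAMESPACE/wav2vec2-large-sw-cv-20hr-v1",
)
result = asr("speech.wav")
print(result["text"])
```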

## Training and evaluation data

The exact dataset was not recorded in this card. The model name (`sw-cv-20hr`) suggests a 20-hour Swahili subset of Common Voice, but this has not been confirmed.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 100
- mixed_precision_training: Native AMP
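
For readers aiming to reproduce a comparable run, the values above map onto `transformers.TrainingArguments` roughly as follows. This is a sketch only: model construction, the tokenizer/feature extractor, and data preprocessing are omitted.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer
# defaults, so they need no explicit setting here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-sw-cv-20hr-v1",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # effective train batch size: 32 * 2 = 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
)
```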

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Model Preparation Time | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:----------------------:|:------:|:------:|
| 4.0167        | 0.9976  | 210   | 1.3039          | 0.0059                 | 0.9301 | 0.3195 |
| 0.7784        | 2.0     | 421   | 0.7306          | 0.0059                 | 0.5818 | 0.1788 |
| 0.5359        | 2.9976  | 631   | 0.5886          | 0.0059                 | 0.5048 | 0.1517 |
| 0.4270        | 4.0     | 842   | 0.5274          | 0.0059                 | 0.4460 | 0.1345 |
| 0.3657        | 4.9976  | 1052  | 0.5617          | 0.0059                 | 0.4620 | 0.1430 |
| 0.3219        | 6.0     | 1263  | 0.5162          | 0.0059                 | 0.4080 | 0.1240 |
| 0.2922        | 6.9976  | 1473  | 0.4861          | 0.0059                 | 0.4074 | 0.1256 |
| 0.2681        | 8.0     | 1684  | 0.5076          | 0.0059                 | 0.4040 | 0.1253 |
| 0.2459        | 8.9976  | 1894  | 0.5042          | 0.0059                 | 0.3915 | 0.1205 |
| 0.2332        | 10.0    | 2105  | 0.5051          | 0.0059                 | 0.3706 | 0.1120 |
| 0.2181        | 10.9976 | 2315  | 0.5370          | 0.0059                 | 0.3750 | 0.1149 |
| 0.2073        | 12.0    | 2526  | 0.5231          | 0.0059                 | 0.3860 | 0.1249 |
| 0.1982        | 12.9976 | 2736  | 0.5290          | 0.0059                 | 0.4045 | 0.1239 |
| 0.1875        | 14.0    | 2947  | 0.5184          | 0.0059                 | 0.3755 | 0.1153 |
| 0.1782        | 14.9976 | 3157  | 0.5215          | 0.0059                 | 0.3587 | 0.1100 |
| 0.1684        | 16.0    | 3368  | 0.5395          | 0.0059                 | 0.3710 | 0.1142 |
| 0.1629        | 16.9976 | 3578  | 0.5499          | 0.0059                 | 0.3608 | 0.1101 |
| 0.1563        | 18.0    | 3789  | 0.5478          | 0.0059                 | 0.3577 | 0.1107 |
| 0.1516        | 18.9976 | 3999  | 0.5290          | 0.0059                 | 0.3649 | 0.1148 |
| 0.1431        | 20.0    | 4210  | 0.5765          | 0.0059                 | 0.3657 | 0.1167 |
| 0.1366        | 20.9976 | 4420  | 0.5604          | 0.0059                 | 0.3617 | 0.1137 |
| 0.1345        | 22.0    | 4631  | 0.5546          | 0.0059                 | 0.3604 | 0.1118 |
| 0.1303        | 22.9976 | 4841  | 0.5284          | 0.0059                 | 0.3511 | 0.1089 |
| 0.1220        | 24.0    | 5052  | 0.5668          | 0.0059                 | 0.3555 | 0.1111 |
| 0.1183        | 24.9976 | 5262  | 0.5874          | 0.0059                 | 0.3521 | 0.1088 |
| 0.1151        | 26.0    | 5473  | 0.5539          | 0.0059                 | 0.3379 | 0.1044 |
| 0.1108        | 26.9976 | 5683  | 0.6110          | 0.0059                 | 0.3375 | 0.1051 |
| 0.1089        | 28.0    | 5894  | 0.5582          | 0.0059                 | 0.3397 | 0.1029 |
| 0.1064        | 28.9976 | 6104  | 0.5774          | 0.0059                 | 0.3432 | 0.1062 |
| 0.1026        | 30.0    | 6315  | 0.6042          | 0.0059                 | 0.3420 | 0.1062 |
| 0.0983        | 30.9976 | 6525  | 0.5793          | 0.0059                 | 0.3402 | 0.1046 |
| 0.0952        | 32.0    | 6736  | 0.6083          | 0.0059                 | 0.3423 | 0.1074 |
| 0.0927        | 32.9976 | 6946  | 0.6015          | 0.0059                 | 0.3363 | 0.1035 |
| 0.0895        | 34.0    | 7157  | 0.5790          | 0.0059                 | 0.3368 | 0.1041 |
| 0.0889        | 34.9976 | 7367  | 0.5530          | 0.0059                 | 0.3338 | 0.1023 |
| 0.0865        | 36.0    | 7578  | 0.5598          | 0.0059                 | 0.3267 | 0.1009 |
| 0.0828        | 36.9976 | 7788  | 0.5699          | 0.0059                 | 0.3249 | 0.1001 |
| 0.0814        | 38.0    | 7999  | 0.5756          | 0.0059                 | 0.3237 | 0.0996 |
| 0.0819        | 38.9976 | 8209  | 0.5878          | 0.0059                 | 0.3363 | 0.1052 |
| 0.0770        | 40.0    | 8420  | 0.5852          | 0.0059                 | 0.3216 | 0.0984 |
| 0.0750        | 40.9976 | 8630  | 0.5940          | 0.0059                 | 0.3295 | 0.1022 |
| 0.0725        | 42.0    | 8841  | 0.5779          | 0.0059                 | 0.3219 | 0.0997 |
| 0.0701        | 42.9976 | 9051  | 0.5962          | 0.0059                 | 0.3144 | 0.0965 |
| 0.0693        | 44.0    | 9262  | 0.6192          | 0.0059                 | 0.3170 | 0.0975 |
| 0.0659        | 44.9976 | 9472  | 0.5989          | 0.0059                 | 0.3126 | 0.0964 |
| 0.0662        | 46.0    | 9683  | 0.6069          | 0.0059                 | 0.3112 | 0.0975 |
| 0.0646        | 46.9976 | 9893  | 0.6309          | 0.0059                 | 0.3164 | 0.0986 |
| 0.0626        | 48.0    | 10104 | 0.6266          | 0.0059                 | 0.3199 | 0.1007 |
| 0.0620        | 48.9976 | 10314 | 0.6403          | 0.0059                 | 0.3116 | 0.0963 |
| 0.0591        | 50.0    | 10525 | 0.6140          | 0.0059                 | 0.3133 | 0.0965 |
| 0.0568        | 50.9976 | 10735 | 0.5947          | 0.0059                 | 0.3078 | 0.0950 |
| 0.0538        | 52.0    | 10946 | 0.6202          | 0.0059                 | 0.3029 | 0.0939 |
| 0.0544        | 52.9976 | 11156 | 0.6215          | 0.0059                 | 0.3120 | 0.0966 |
| 0.0526        | 54.0    | 11367 | 0.6637          | 0.0059                 | 0.3093 | 0.0959 |
| 0.0500        | 54.9976 | 11577 | 0.6513          | 0.0059                 | 0.3079 | 0.0955 |
| 0.0518        | 56.0    | 11788 | 0.6611          | 0.0059                 | 0.3070 | 0.0948 |
| 0.0493        | 56.9976 | 11998 | 0.6415          | 0.0059                 | 0.3041 | 0.0941 |
| 0.0482        | 58.0    | 12209 | 0.6386          | 0.0059                 | 0.3042 | 0.0939 |
| 0.0461        | 58.9976 | 12419 | 0.6664          | 0.0059                 | 0.3160 | 0.0995 |
| 0.0445        | 60.0    | 12630 | 0.6472          | 0.0059                 | 0.3057 | 0.0963 |
| 0.0449        | 60.9976 | 12840 | 0.6510          | 0.0059                 | 0.3103 | 0.0972 |
| 0.0437        | 62.0    | 13051 | 0.6696          | 0.0059                 | 0.3166 | 0.1005 |


### Framework versions

- Transformers 4.43.1
- Pytorch 2.2.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1