---
base_model: Lakoc/DeCRED_small_cv_2
tags:
- generated_from_trainer
datasets:
- common_voice_13_0
metrics:
- wer
model-index:
- name: DeCRED_small_cv_v2_scalar_mixing
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# DeCRED_small_cv_v2_scalar_mixing

This model is a fine-tuned version of [Lakoc/DeCRED_small_cv_2](https://huggingface.co/Lakoc/DeCRED_small_cv_2) on the common_voice_13_0 dataset.
It achieves the following results on the evaluation set (a worked derivation of the error rates follows the list):
- Loss: 2.0838
- Cer: 0.4056
- Wer: 0.6611
- Mer: 0.5983
- Wil: 0.8066
- Wip: 0.1934
- Hits: 20649
- Substitutions: 21863
- Deletions: 4009
- Insertions: 4882
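
The error-rate metrics above appear to follow the standard jiwer-style definitions, so they can be reproduced directly from the hit/substitution/deletion/insertion counts. The short sketch below is an illustration only, not part of the original training code:

```python
# Reproduce the reported error rates from the raw counts above,
# assuming the standard jiwer-style definitions of WER/MER/WIP/WIL.
hits, subs, dels, ins = 20649, 21863, 4009, 4882

ref_len = hits + subs + dels   # words in the reference transcripts
hyp_len = hits + subs + ins    # words in the model hypotheses

wer = (subs + dels + ins) / ref_len          # 0.6611
mer = (subs + dels + ins) / (ref_len + ins)  # 0.5983
wip = (hits / ref_len) * (hits / hyp_len)    # 0.1934
wil = 1 - wip                                # 0.8066

print(f"WER={wer:.4f}  MER={mer:.4f}  WIP={wip:.4f}  WIL={wil:.4f}")
```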

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an approximate `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 2
- total_train_batch_size: 512
- total_eval_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50.0
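
As an illustration only, the logged configuration above roughly corresponds to the following `Seq2SeqTrainingArguments`; the actual training script (including any DeCRED-specific options) is not part of this card, and the output directory name is a placeholder:

```python
from transformers import Seq2SeqTrainingArguments

# Approximate reconstruction of the logged hyperparameters; the original
# training setup may differ. With 2 GPUs and gradient accumulation of 2,
# the effective train batch size is 128 * 2 * 2 = 512.
training_args = Seq2SeqTrainingArguments(
    output_dir="decred_small_cv_v2_scalar_mixing",  # placeholder
    learning_rate=5e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,
    num_train_epochs=50.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
)
```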

### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer     | Wer     | Mer    | Wil    | Wip    | Hits  | Substitutions | Deletions | Insertions |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:------:|:------:|:------:|:-----:|:-------------:|:---------:|:----------:|
| 6.9657        | 0.98  | 22   | 6.8485          | 59.9990 | 50.8263 | 0.9996 | 1.0000 | 0.0000 | 909   | 45605         | 7         | 2318877    |
| 6.585         | 2.0   | 45   | 6.6096          | 60.0578 | 50.5684 | 0.9996 | 1.0000 | 0.0000 | 990   | 45524         | 7         | 2306961    |
| 6.479         | 2.98  | 67   | 6.3875          | 59.6307 | 50.1500 | 0.9996 | 1.0000 | 0.0000 | 1046  | 45464         | 11        | 2287554    |
| 6.1764        | 4.0   | 90   | 6.1618          | 58.9125 | 49.4174 | 0.9995 | 1.0000 | 0.0000 | 1100  | 45410         | 11        | 2253527    |
| 6.0943        | 4.98  | 112  | 5.9524          | 57.6932 | 48.3373 | 0.9995 | 1.0000 | 0.0000 | 1191  | 45307         | 23        | 2203369    |
| 5.7875        | 6.0   | 135  | 5.7400          | 56.4049 | 47.1201 | 0.9994 | 1.0000 | 0.0000 | 1350  | 45145         | 26        | 2146902    |
| 5.7306        | 6.98  | 157  | 5.5433          | 55.1239 | 46.0142 | 0.9993 | 1.0000 | 0.0000 | 1449  | 45043         | 29        | 2095554    |
| 5.3893        | 8.0   | 180  | 5.3443          | 53.1765 | 44.2142 | 0.9992 | 1.0000 | 0.0000 | 1609  | 44876         | 36        | 2011978    |
| 5.355         | 8.98  | 202  | 5.1604          | 50.8395 | 42.2829 | 0.9991 | 1.0000 | 0.0000 | 1726  | 44743         | 52        | 1922250    |
| 5.0009        | 10.0  | 225  | 4.9748          | 47.9798 | 39.7937 | 0.9990 | 1.0000 | 0.0000 | 1921  | 44525         | 75        | 1806641    |
| 4.9572        | 10.98 | 247  | 4.8037          | 44.5128 | 36.9002 | 0.9988 | 0.9999 | 0.0001 | 2039  | 44377         | 105       | 1672151    |
| 4.7249        | 12.0  | 270  | 4.6314          | 40.9512 | 34.0044 | 0.9985 | 0.9999 | 0.0001 | 2306  | 44103         | 112       | 1537704    |
| 4.5906        | 12.98 | 292  | 4.4729          | 36.5890 | 30.3183 | 0.9982 | 0.9999 | 0.0001 | 2542  | 43829         | 150       | 1366459    |
| 4.3943        | 14.0  | 315  | 4.3138          | 32.2323 | 26.6965 | 0.9978 | 0.9999 | 0.0001 | 2765  | 43562         | 194       | 1198191    |
| 4.3164        | 14.98 | 337  | 4.1678          | 28.0206 | 23.2171 | 0.9972 | 0.9998 | 0.0002 | 3042  | 43206         | 273       | 1036603    |
| 4.1206        | 16.0  | 360  | 4.0216          | 23.6627 | 19.7857 | 0.9964 | 0.9997 | 0.0003 | 3295  | 42841         | 385       | 877224     |
| 4.0375        | 16.98 | 382  | 3.8879          | 19.1286 | 16.0039 | 0.9952 | 0.9996 | 0.0004 | 3622  | 42341         | 558       | 701619     |
| 3.8357        | 18.0  | 405  | 3.7545          | 15.4344 | 12.9968 | 0.9934 | 0.9994 | 0.0006 | 3991  | 41883         | 647       | 562094     |
| 3.7535        | 18.98 | 427  | 3.6327          | 11.6498 | 9.9882  | 0.9907 | 0.9991 | 0.0009 | 4356  | 41365         | 800       | 422498     |
| 3.5453        | 20.0  | 450  | 3.5116          | 9.0630  | 7.8157  | 0.9872 | 0.9987 | 0.0013 | 4705  | 40852         | 964       | 321778     |
| 3.5829        | 20.98 | 472  | 3.4015          | 6.9251  | 6.0536  | 0.9817 | 0.9979 | 0.0021 | 5240  | 40091         | 1190      | 240338     |
| 3.3805        | 22.0  | 495  | 3.2922          | 5.3260  | 4.7628  | 0.9754 | 0.9970 | 0.0030 | 5588  | 39576         | 1357      | 180635     |
| 3.3505        | 22.98 | 517  | 3.1933          | 3.7440  | 3.4844  | 0.9640 | 0.9953 | 0.0047 | 6051  | 38927         | 1543      | 121630     |
| 3.1605        | 24.0  | 540  | 3.0954          | 2.8455  | 2.7410  | 0.9505 | 0.9928 | 0.0072 | 6644  | 38101         | 1776      | 87636      |
| 3.1195        | 24.98 | 562  | 3.0071          | 1.8640  | 1.9752  | 0.9279 | 0.9887 | 0.0113 | 7141  | 37306         | 2074      | 52510      |
| 3.0494        | 26.0  | 585  | 2.9201          | 1.7379  | 1.8401  | 0.9164 | 0.9856 | 0.0144 | 7812  | 36313         | 2396      | 46896      |
| 3.0038        | 26.98 | 607  | 2.8418          | 1.3510  | 1.5287  | 0.8936 | 0.9800 | 0.0200 | 8466  | 35489         | 2566      | 33062      |
| 2.8702        | 28.0  | 630  | 2.7651          | 1.0949  | 1.3176  | 0.8699 | 0.9733 | 0.0267 | 9170  | 34524         | 2827      | 23947      |
| 2.8388        | 28.98 | 652  | 2.6963          | 0.9275  | 1.1758  | 0.8480 | 0.9664 | 0.0336 | 9802  | 33665         | 3054      | 17981      |
| 2.7353        | 30.0  | 675  | 2.6292          | 0.7931  | 1.0642  | 0.8230 | 0.9572 | 0.0428 | 10649 | 32622         | 3250      | 13637      |
| 2.694         | 30.98 | 697  | 2.5693          | 0.7406  | 1.0138  | 0.8049 | 0.9491 | 0.0509 | 11429 | 31693         | 3399      | 12071      |
| 2.63          | 32.0  | 720  | 2.5112          | 0.6716  | 0.9492  | 0.7835 | 0.9395 | 0.0605 | 12203 | 30845         | 3473      | 9839       |
| 2.5981        | 32.98 | 742  | 2.4596          | 0.6256  | 0.9070  | 0.7648 | 0.9298 | 0.0702 | 12979 | 29960         | 3582      | 8653       |
| 2.521         | 34.0  | 765  | 2.4099          | 0.5947  | 0.8741  | 0.7464 | 0.9192 | 0.0808 | 13813 | 29004         | 3704      | 7957       |
| 2.5005        | 34.98 | 787  | 2.3661          | 0.5715  | 0.8503  | 0.7324 | 0.9105 | 0.0895 | 14454 | 28259         | 3808      | 7488       |
| 2.4067        | 36.0  | 810  | 2.3241          | 0.5436  | 0.8208  | 0.7145 | 0.8991 | 0.1009 | 15259 | 27414         | 3848      | 6923       |
| 2.3969        | 36.98 | 832  | 2.2875          | 0.5252  | 0.8022  | 0.7004 | 0.8890 | 0.1110 | 15966 | 26612         | 3943      | 6765       |
| 2.3816        | 38.0  | 855  | 2.2529          | 0.5033  | 0.7763  | 0.6837 | 0.8772 | 0.1228 | 16710 | 25846         | 3965      | 6302       |
| 2.3544        | 38.98 | 877  | 2.2230          | 0.4892  | 0.7597  | 0.6709 | 0.8673 | 0.1327 | 17339 | 25191         | 3991      | 6159       |
| 2.2744        | 40.0  | 900  | 2.1951          | 0.4719  | 0.7402  | 0.6570 | 0.8565 | 0.1435 | 17977 | 24541         | 4003      | 5893       |
| 2.2653        | 40.98 | 922  | 2.1715          | 0.4591  | 0.7252  | 0.6454 | 0.8470 | 0.1530 | 18532 | 23960         | 4029      | 5747       |
| 2.2736        | 42.0  | 945  | 2.1500          | 0.4461  | 0.7092  | 0.6345 | 0.8382 | 0.1618 | 19009 | 23522         | 3990      | 5482       |
| 2.256         | 42.98 | 967  | 2.1324          | 0.4347  | 0.6953  | 0.6240 | 0.8294 | 0.1706 | 19489 | 23062         | 3970      | 5313       |
| 2.2187        | 44.0  | 990  | 2.1170          | 0.4265  | 0.6852  | 0.6164 | 0.8227 | 0.1773 | 19834 | 22682         | 4005      | 5187       |
| 2.2122        | 44.98 | 1012 | 2.1050          | 0.4187  | 0.6755  | 0.6094 | 0.8166 | 0.1834 | 20141 | 22360         | 4020      | 5044       |
| 2.2259        | 46.0  | 1035 | 2.0954          | 0.4126  | 0.6690  | 0.6045 | 0.8123 | 0.1877 | 20360 | 22162         | 3999      | 4963       |
| 2.2367        | 46.98 | 1057 | 2.0889          | 0.4087  | 0.6642  | 0.6007 | 0.8088 | 0.1912 | 20538 | 21978         | 4005      | 4916       |
| 2.1789        | 48.0  | 1080 | 2.0849          | 0.4062  | 0.6617  | 0.5988 | 0.8071 | 0.1929 | 20623 | 21886         | 4012      | 4887       |
| 2.0912        | 48.89 | 1100 | 2.0838          | 0.4056  | 0.6611  | 0.5983 | 0.8066 | 0.1934 | 20649 | 21863         | 4009      | 4882       |


### Framework versions

- Transformers 4.40.0.dev0
- Pytorch 2.2.0+rocm5.6
- Datasets 2.18.0
- Tokenizers 0.15.2