---
license: apache-2.0
tags:
- automatic-speech-recognition
- multilingual_librispeech
- generated_from_trainer
datasets:
- multilingual_librispeech
model-index:
- name: wav2vec2-300m-mls-german-ft
  results: []
---

# wav2vec2-300m-mls-german-ft

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the 10-hour German subset of the Multilingual LibriSpeech (MLS) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2398
- WER: 0.1520
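
The checkpoint can be tried out with the `transformers` ASR pipeline. This is a minimal sketch: the repository id below is assumed from the model name in this card, and the audio path is a placeholder for any 16 kHz mono German recording.

```python
from transformers import pipeline

# Repository id assumed from the model name in this card.
asr = pipeline(
    "automatic-speech-recognition",
    model="patrickvonplaten/wav2vec2-300m-mls-german-ft",
)

# Placeholder path to a 16 kHz mono German audio file.
print(asr("example_german.wav")["text"])
```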

## Model description

The base checkpoint, [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m), is the 300M-parameter XLS-R model, a multilingual wav2vec 2.0 speech encoder. It was fine-tuned here on 10 hours of German speech from Multilingual LibriSpeech for automatic speech recognition.

## Intended uses & limitations

The model transcribes German speech and expects 16 kHz, single-channel audio input. Because it was fine-tuned on only 10 hours of read audiobook speech, accuracy on spontaneous, accented, or noisy speech may be noticeably lower than the evaluation WER reported above.

## Training and evaluation data

Training used the 10-hour German subset of [Multilingual LibriSpeech](https://huggingface.co/datasets/multilingual_librispeech), and the results above are reported on the German evaluation split of the same dataset.
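
For orientation, the sketch below shows one way the data could be loaded with the `datasets` library; the config and split names (`german`, `train.9h`, `train.1h`) are assumptions based on the public MLS limited-supervision splits, where the 10h condition is typically the union of the 9h and 1h subsets.

```python
from datasets import load_dataset

# Assumed config/split names for the public MLS dataset; the 10-hour
# condition is taken to be the 9h + 1h limited-supervision splits combined.
mls_9h = load_dataset("multilingual_librispeech", "german", split="train.9h")
mls_1h = load_dataset("multilingual_librispeech", "german", split="train.1h")

print(len(mls_9h), len(mls_1h))
```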

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 200.0
- mixed_precision_training: Native AMP
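
A minimal sketch of these values expressed as `transformers` `TrainingArguments`, assuming single-device training (so `train_batch_size` maps directly to `per_device_train_batch_size`); the output directory is a placeholder, and the Adam betas/epsilon match the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./wav2vec2-300m-mls-german-ft",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,  # assumes a single GPU
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",      # linear decay after warmup
    warmup_steps=1000,
    num_train_epochs=200.0,
    fp16=True,                       # "Native AMP" mixed precision
)
```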

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 3.0132 | 7.25 | 500 | 2.9393 | 1.0 |
| 2.9241 | 14.49 | 1000 | 2.8734 | 1.0 |
| 1.0766 | 21.74 | 1500 | 0.2773 | 0.2488 |
| 0.8416 | 28.99 | 2000 | 0.2224 | 0.1990 |
| 0.8048 | 36.23 | 2500 | 0.2063 | 0.1792 |
| 0.7664 | 43.48 | 3000 | 0.2088 | 0.1748 |
| 0.6571 | 50.72 | 3500 | 0.2042 | 0.1668 |
| 0.7014 | 57.97 | 4000 | 0.2136 | 0.1649 |
| 0.6171 | 65.22 | 4500 | 0.2139 | 0.1641 |
| 0.6609 | 72.46 | 5000 | 0.2144 | 0.1621 |
| 0.6318 | 79.71 | 5500 | 0.2129 | 0.1600 |
| 0.6222 | 86.96 | 6000 | 0.2124 | 0.1582 |
| 0.608 | 94.2 | 6500 | 0.2255 | 0.1639 |
| 0.6099 | 101.45 | 7000 | 0.2265 | 0.1622 |
| 0.6069 | 108.7 | 7500 | 0.2246 | 0.1593 |
| 0.5929 | 115.94 | 8000 | 0.2323 | 0.1617 |
| 0.6218 | 123.19 | 8500 | 0.2287 | 0.1566 |
| 0.5751 | 130.43 | 9000 | 0.2275 | 0.1563 |
| 0.5181 | 137.68 | 9500 | 0.2316 | 0.1579 |
| 0.6306 | 144.93 | 10000 | 0.2372 | 0.1556 |
| 0.5874 | 152.17 | 10500 | 0.2362 | 0.1533 |
| 0.5546 | 159.42 | 11000 | 0.2342 | 0.1543 |
| 0.6294 | 166.67 | 11500 | 0.2381 | 0.1536 |
| 0.5989 | 173.91 | 12000 | 0.2360 | 0.1527 |
| 0.5697 | 181.16 | 12500 | 0.2399 | 0.1526 |
| 0.5379 | 188.41 | 13000 | 0.2375 | 0.1523 |
| 0.5022 | 195.65 | 13500 | 0.2395 | 0.1519 |
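
The WER values above can be recomputed for any set of transcripts with the `evaluate` library (which wraps `jiwer` for this metric); the strings below are placeholders, not outputs of this model.

```python
import evaluate  # pip install evaluate jiwer

wer = evaluate.load("wer")

# Placeholder transcripts; in practice, predictions come from decoding the
# MLS German evaluation split with the fine-tuned model.
predictions = ["der kleine hund läuft durch den park"]
references = ["der kleine hund lief durch den park"]

print(f"WER: {wer.compute(predictions=predictions, references=references):.4f}")
```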

### Framework versions

- Transformers 4.13.0.dev0
- PyTorch 1.10.0
- Datasets 1.15.2.dev0
- Tokenizers 0.10.3