lucas-meyer committed
Commit: eace147
1 Parent(s): d8295c1

Update README.md
README.md CHANGED
@@ -2,7 +2,7 @@
 tags:
 - generated_from_trainer
 datasets:
--
+- google/fleurs
 metrics:
 - wer
 model-index:
@@ -28,24 +28,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # xls-r-fleurs_nl-run4
 
-This model
-It achieves the following results
--
-- Wer:
-
-## Model description
-
-More information needed
-
-## Intended uses & limitations
-
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
+This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the FLEURS (nl) dataset.
+It achieves the following results:
+- Wer (Validation): 42.94%
+- Wer (Test): 43.74%
 
 ### Training hyperparameters
 
@@ -64,7 +50,7 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Wer |
+| Training Loss | Epoch | Step | Validation Loss | Wer (Train) |
 |:-------------:|:-----:|:----:|:---------------:|:------:|
 | 0.1216 | 1.55 | 100 | 0.5803 | 0.4294 |
 | 0.0775 | 3.1 | 200 | 0.6325 | 0.4420 |
@@ -76,4 +62,4 @@ The following hyperparameters were used during training:
 - Transformers 4.28.0
 - Pytorch 2.0.1+cu117
 - Datasets 2.14.4
-- Tokenizers 0.13.3
+- Tokenizers 0.13.3
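For context, here is a minimal sketch of how the WER figures added by this commit could be spot-checked with `transformers` and `datasets` (both pinned under framework versions above) plus the `evaluate` package. The repo id `lucas-meyer/xls-r-fleurs_nl-run4` and the FLEURS config name `nl_nl` are assumptions inferred from the commit author and the "(nl)" in the card text, not something stated in the diff.

```python
# Sketch only: assumed repo id and FLEURS config; needs `pip install evaluate jiwer`.
import torch
import evaluate
from datasets import load_dataset, Audio
from transformers import AutoProcessor, AutoModelForCTC

model_id = "lucas-meyer/xls-r-fleurs_nl-run4"  # assumed, inferred from author + model name

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id).eval()

# Dutch FLEURS; cast audio to 16 kHz to match the wav2vec 2.0 feature extractor.
ds = load_dataset("google/fleurs", "nl_nl", split="test")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

wer_metric = evaluate.load("wer")
predictions, references = [], []

for sample in ds.select(range(16)):  # small slice for a quick check
    inputs = processor(
        sample["audio"]["array"],
        sampling_rate=16_000,
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    pred_ids = torch.argmax(logits, dim=-1)
    predictions.append(processor.batch_decode(pred_ids)[0])
    references.append(sample["transcription"])

print("WER:", wer_metric.compute(predictions=predictions, references=references))
```

Running over the full test split (rather than the 16-sample slice above) is what would be expected to approximate the 43.74% test WER reported in the updated card; a small slice will only give a rough indication.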