---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
datasets:
- common_voice_17_0
metrics:
- wer
model-index:
- name: xlsr-czech
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: common_voice_17_0
type: common_voice_17_0
config: cs
split: validation
args: cs
metrics:
- name: Wer
type: wer
value: 0.17594345952554907
---
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/badr-nlp/xlsr-continual-finetuning-polish/runs/c0c7ty73)
# xlsr-czech
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the Czech (`cs`) configuration of the Common Voice 17.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2040
- Wer: 0.1759
- Cer: 0.0387
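The WER and CER above are word- and character-level edit distances between the reference and predicted transcripts, normalized by reference length. A minimal pure-Python sketch of the computation (not the exact evaluation script used during training):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (dynamic programming)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution
        prev = cur
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(list(reference), list(hypothesis)) / len(reference)
```

For example, `wer("a b c d", "a x c")` counts one substitution and one deletion against four reference words, giving 0.5.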
## Model description
A 300M-parameter XLS-R (wav2vec 2.0) checkpoint fine-tuned for Czech automatic speech recognition. The model predicts character sequences with a CTC head on top of the pretrained speech encoder.
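A minimal inference sketch. The repo id below is a placeholder (point it at this checkpoint's actual hub path or local directory); heavy imports are kept inside the function so merely defining the sketch has no dependencies:

```python
def transcribe(audio_path: str, model_id: str = "path/to/xlsr-czech"):
    """Transcribe a Czech audio file with the fine-tuned CTC model.

    `model_id` is a placeholder -- replace it with the actual checkpoint
    location. XLS-R expects 16 kHz mono audio, so the file is resampled.
    """
    import torch
    import librosa
    from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    speech, _ = librosa.load(audio_path, sr=16_000)
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred_ids = logits.argmax(dim=-1)          # greedy CTC decoding
    return processor.batch_decode(pred_ids)[0]
```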
## Intended uses & limitations
The model is intended for transcribing Czech speech sampled at 16 kHz. It was fine-tuned on read speech from Common Voice, so accuracy is likely to degrade on spontaneous, noisy, or domain-specific audio.
## Training and evaluation data
Training and evaluation used the Czech (`cs`) configuration of Common Voice 17.0; the metrics above are reported on its validation split.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
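The hyperparameters above map roughly onto `transformers.TrainingArguments` as follows. This is a reconstruction, not the original training script; note the effective train batch size is 16 × 2 = 32:

```python
# Plausible TrainingArguments kwargs reconstructed from the list above.
# The actual training script was not published, so treat this as a sketch.
training_kwargs = dict(
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch: 16 * 2 = 32
    warmup_steps=500,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
)
```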
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 5.9885 | 0.1589 | 100 | 4.5039 | 1.0 | 1.0 |
| 3.3721 | 0.3177 | 200 | 3.3649 | 1.0 | 1.0 |
| 3.4173 | 0.4766 | 300 | 3.2849 | 1.0000 | 0.9888 |
| 1.5357 | 0.6354 | 400 | 1.2694 | 0.9288 | 0.3102 |
| 0.9318 | 0.7943 | 500 | 0.6914 | 0.6972 | 0.1799 |
| 0.8233 | 0.9531 | 600 | 0.5004 | 0.5664 | 0.1342 |
| 0.3873 | 1.1120 | 700 | 0.3576 | 0.3941 | 0.0903 |
| 0.2506 | 1.2708 | 800 | 0.3360 | 0.3798 | 0.0861 |
| 0.2435 | 1.4297 | 900 | 0.2956 | 0.3328 | 0.0744 |
| 0.2828 | 1.5886 | 1000 | 0.2773 | 0.3156 | 0.0710 |
| 0.2454 | 1.7474 | 1100 | 0.2764 | 0.3138 | 0.0699 |
| 0.2607 | 1.9063 | 1200 | 0.2641 | 0.3013 | 0.0677 |
| 0.2389 | 2.0651 | 1300 | 0.2585 | 0.2911 | 0.0652 |
| 0.2457 | 2.2240 | 1400 | 0.2715 | 0.2923 | 0.0669 |
| 0.2602 | 2.3828 | 1500 | 0.2530 | 0.2763 | 0.0622 |
| 0.1614 | 2.5417 | 1600 | 0.2402 | 0.2651 | 0.0598 |
| 0.2206 | 2.7006 | 1700 | 0.2422 | 0.2807 | 0.0621 |
| 0.217 | 2.8594 | 1800 | 0.2356 | 0.2720 | 0.0598 |
| 0.093 | 3.0183 | 1900 | 0.2451 | 0.2628 | 0.0597 |
| 0.1327 | 3.1771 | 2000 | 0.2514 | 0.2710 | 0.0606 |
| 0.0989 | 3.3360 | 2100 | 0.2552 | 0.2754 | 0.0626 |
| 0.1772 | 3.4948 | 2200 | 0.2400 | 0.2691 | 0.0599 |
| 0.095 | 3.6537 | 2300 | 0.2423 | 0.2499 | 0.0568 |
| 0.1408 | 3.8125 | 2400 | 0.2392 | 0.2546 | 0.0591 |
| 0.1694 | 3.9714 | 2500 | 0.2372 | 0.2483 | 0.0564 |
| 0.0971 | 4.1303 | 2600 | 0.2224 | 0.2305 | 0.0512 |
| 0.1006 | 4.2891 | 2700 | 0.2128 | 0.2308 | 0.0516 |
| 0.1041 | 4.4480 | 2800 | 0.2229 | 0.2299 | 0.0518 |
| 0.1004 | 4.6068 | 2900 | 0.2282 | 0.2411 | 0.0536 |
| 0.0656 | 4.7657 | 3000 | 0.2302 | 0.2328 | 0.0520 |
| 0.101 | 4.9245 | 3100 | 0.2150 | 0.2229 | 0.0507 |
| 0.082 | 5.0834 | 3200 | 0.2314 | 0.2380 | 0.0541 |
| 0.0698 | 5.2423 | 3300 | 0.2291 | 0.2264 | 0.0512 |
| 0.0785 | 5.4011 | 3400 | 0.2198 | 0.2270 | 0.0505 |
| 0.0595 | 5.5600 | 3500 | 0.2153 | 0.2213 | 0.0490 |
| 0.0882 | 5.7188 | 3600 | 0.2154 | 0.2174 | 0.0487 |
| 0.0763 | 5.8777 | 3700 | 0.2245 | 0.2178 | 0.0484 |
| 0.0937 | 6.0365 | 3800 | 0.2250 | 0.2200 | 0.0494 |
| 0.0664 | 6.1954 | 3900 | 0.2147 | 0.2112 | 0.0473 |
| 0.0665 | 6.3542 | 4000 | 0.2122 | 0.2069 | 0.0464 |
| 0.0876 | 6.5131 | 4100 | 0.2173 | 0.2117 | 0.0476 |
| 0.0653 | 6.6720 | 4200 | 0.2088 | 0.2132 | 0.0474 |
| 0.0863 | 6.8308 | 4300 | 0.2181 | 0.2085 | 0.0470 |
| 0.088 | 6.9897 | 4400 | 0.2058 | 0.2049 | 0.0456 |
| 0.0721 | 7.1485 | 4500 | 0.2155 | 0.2048 | 0.0456 |
| 0.0474 | 7.3074 | 4600 | 0.2104 | 0.1992 | 0.0443 |
| 0.1084 | 7.4662 | 4700 | 0.2136 | 0.1972 | 0.0444 |
| 0.0695 | 7.6251 | 4800 | 0.2049 | 0.1922 | 0.0426 |
| 0.0463 | 7.7840 | 4900 | 0.2117 | 0.1900 | 0.0420 |
| 0.0485 | 7.9428 | 5000 | 0.2063 | 0.1886 | 0.0420 |
| 0.0401 | 8.1017 | 5100 | 0.2031 | 0.1871 | 0.0414 |
| 0.0355 | 8.2605 | 5200 | 0.2074 | 0.1886 | 0.0421 |
| 0.0543 | 8.4194 | 5300 | 0.2140 | 0.1878 | 0.0421 |
| 0.0364 | 8.5782 | 5400 | 0.2104 | 0.1869 | 0.0418 |
| 0.0334 | 8.7371 | 5500 | 0.2021 | 0.1827 | 0.0407 |
| 0.0588 | 8.8959 | 5600 | 0.1960 | 0.1808 | 0.0400 |
| 0.0793 | 9.0548 | 5700 | 0.1980 | 0.1793 | 0.0394 |
| 0.0597 | 9.2137 | 5800 | 0.1992 | 0.1774 | 0.0388 |
| 0.0662 | 9.3725 | 5900 | 0.2035 | 0.1767 | 0.0388 |
| 0.0352 | 9.5314 | 6000 | 0.2032 | 0.1773 | 0.0389 |
| 0.0391 | 9.6902 | 6100 | 0.2041 | 0.1767 | 0.0387 |
| 0.0357 | 9.8491 | 6200 | 0.2040 | 0.1759 | 0.0387 |
### Framework versions
- Transformers 4.42.0.dev0
- Pytorch 2.3.1+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1