# wav2vec2-large-mms-1b-korean-colab_v3
This model is a fine-tuned version of [weekcircle/wav2vec2-large-mms-1b-korean-colab_v2](https://huggingface.co/weekcircle/wav2vec2-large-mms-1b-korean-colab_v2) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.1476
- Wer: 0.3443
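
Here Wer is the word error rate: (substitutions + deletions + insertions) divided by the number of reference words, so lower is better. A minimal sketch of how such a score can be computed with the `evaluate` library; the Korean strings below are illustrative, not drawn from the actual evaluation set:

```python
import evaluate  # pip install evaluate jiwer

# Word error rate: (substitutions + deletions + insertions) / reference words.
wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    predictions=["안녕하세요 반갑습니다"],        # hypothetical model output
    references=["안녕하세요 반갑습니다 여러분"],  # hypothetical ground truth
)
print(score)  # one deletion over three reference words -> 0.333...
```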
## Model description

More information needed

## Intended uses & limitations

More information needed
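
Although no official usage snippet is provided, a minimal inference sketch could look like the following, assuming the checkpoint is available on the Hugging Face Hub under this model ID; `korean_sample.wav` is a placeholder for your own 16 kHz Korean audio file:

```python
from transformers import pipeline

# Load the fine-tuned CTC checkpoint as a speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="weekcircle/wav2vec2-large-mms-1b-korean-colab_v3",
)

# Transcribe a local audio file (placeholder path).
result = asr("korean_sample.wav")
print(result["text"])
```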
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (mirrored in the sketch after this list):
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
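
These settings map onto `transformers.TrainingArguments` roughly as follows. This is a sketch: `output_dir` and any argument not listed above are illustrative, not taken from the original run.

```python
from transformers import TrainingArguments

# The reported hyperparameters expressed as TrainingArguments.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-mms-1b-korean-colab_v3",  # illustrative
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=4,
)
```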
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2374        | 0.18  | 100  | 0.1654          | 0.3761 |
| 0.2231        | 0.36  | 200  | 0.1648          | 0.3752 |
| 0.2263        | 0.53  | 300  | 0.1647          | 0.3859 |
| 0.2197        | 0.71  | 400  | 0.1618          | 0.3628 |
| 0.223         | 0.89  | 500  | 0.1642          | 0.3792 |
| 0.2143        | 1.07  | 600  | 0.1585          | 0.3684 |
| 0.2082        | 1.24  | 700  | 0.1589          | 0.3711 |
| 0.2166        | 1.42  | 800  | 0.1567          | 0.3647 |
| 0.2087        | 1.6   | 900  | 0.1561          | 0.3567 |
| 0.2109        | 1.78  | 1000 | 0.1551          | 0.3570 |
| 0.2036        | 1.95  | 1100 | 0.1553          | 0.3644 |
| 0.1926        | 2.13  | 1200 | 0.1545          | 0.3579 |
| 0.1972        | 2.31  | 1300 | 0.1539          | 0.3508 |
| 0.2086        | 2.49  | 1400 | 0.1526          | 0.3523 |
| 0.2179        | 2.66  | 1500 | 0.1524          | 0.3502 |
| 0.2036        | 2.84  | 1600 | 0.1515          | 0.3502 |
| 0.2196        | 3.02  | 1700 | 0.1510          | 0.3459 |
| 0.2149        | 3.2   | 1800 | 0.1498          | 0.3462 |
| 0.2111        | 3.37  | 1900 | 0.1485          | 0.3477 |
| 0.2043        | 3.55  | 2000 | 0.1481          | 0.3443 |
| 0.2043        | 3.73  | 2100 | 0.1475          | 0.3480 |
| 0.2018        | 3.91  | 2200 | 0.1476          | 0.3443 |
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
## Base model

- facebook/mms-1b-l1107