---
library_name: transformers
license: cc-by-nc-4.0
base_model: facebook/mms-1b-all
tags:
- automatic-speech-recognition
- BembaSpeech
- mms
- generated_from_trainer
metrics:
- wer
model-index:
- name: mms-1b-bem-genbed-all
  results: []
---

# mms-1b-bem-genbed-all

This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on the BembaSpeech (bem) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2479
- Wer: 0.4062

## Model description

This checkpoint adapts the multilingual [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) speech model to automatic speech recognition of Bemba (bem), fine-tuned on the BembaSpeech corpus.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 5.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.2747 | 200  | 0.6312          | 0.6736 |
| No log        | 0.5495 | 400  | 0.3199          | 0.4919 |
| 2.9901        | 0.8242 | 600  | 0.3014          | 0.4613 |
| 2.9901        | 1.0989 | 800  | 0.2825          | 0.4433 |
| 0.3968        | 1.3736 | 1000 | 0.2783          | 0.4541 |
| 0.3968        | 1.6484 | 1200 | 0.2732          | 0.4294 |
| 0.3968        | 1.9231 | 1400 | 0.2649          | 0.4244 |
| 0.3766        | 2.1978 | 1600 | 0.2621          | 0.4204 |
| 0.3766        | 2.4725 | 1800 | 0.2628          | 0.4171 |
| 0.3537        | 2.7473 | 2000 | 0.2579          | 0.4187 |
| 0.3537        | 3.0220 | 2200 | 0.2557          | 0.4034 |
| 0.3537        | 3.2967 | 2400 | 0.2524          | 0.4091 |
| 0.3529        | 3.5714 | 2600 | 0.2535          | 0.4061 |
| 0.3529        | 3.8462 | 2800 | 0.2495          | 0.4034 |
| 0.3393        | 4.1209 | 3000 | 0.2494          | 0.4065 |
| 0.3393        | 4.3956 | 3200 | 0.2488          | 0.4066 |
| 0.3393        | 4.6703 | 3400 | 0.2482          | 0.4028 |
| 0.3332        | 4.9451 | 3600 | 0.2479          | 0.4062 |

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
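
## Example usage

The card does not include an inference example; the sketch below shows how a fine-tuned MMS CTC checkpoint of this kind is typically loaded with `transformers`. The repository id (`<your-namespace>/mms-1b-bem-genbed-all`) and the audio file name are placeholders, not values taken from this card.

```python
# Minimal inference sketch. Assumptions: the checkpoint is a Wav2Vec2-style CTC
# model (as facebook/mms-1b-all is) and the audio is resampled to 16 kHz.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "<your-namespace>/mms-1b-bem-genbed-all"  # placeholder repo id
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a local audio file and resample to the 16 kHz rate MMS expects.
speech, _ = librosa.load("bemba_sample.wav", sr=16_000)  # placeholder file
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token ids.
predicted_ids = torch.argmax(logits, dim=-1)[0]
print(processor.decode(predicted_ids))
```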
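
## Training configuration sketch

For reference, the hyperparameters listed above map onto a `transformers` `TrainingArguments` configuration roughly as follows. This is a reconstruction from the card, not the original training script: `fp16=True` is an assumption based on "Native AMP", and the 200-step evaluation interval is inferred from the results table.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the listed hyperparameters; Adam betas and
# epsilon are left at the library defaults (0.9, 0.999, 1e-8), matching the card.
training_args = TrainingArguments(
    output_dir="mms-1b-bem-genbed-all",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5.0,
    fp16=True,               # assumption: "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=200,          # inferred from the 200-step evaluation cadence above
)
```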