---
language:
- el
license: apache-2.0
tags:
- whisper-event
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_11_0,google/fleurs
metrics:
- wer
model-index:
- name: Whisper small Greek Farsipal and El Greco
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: mozilla-foundation/common_voice_11_0,google/fleurs el,el_gr
      type: mozilla-foundation/common_voice_11_0,google/fleurs
      config: el
      split: None
    metrics:
    - name: Wer
      type: wer
      value: 17.199108469539375
---

# Whisper small Greek Farsipal and El Greco

This model is a fine-tuned version of [emilios/whisper-sm-el-farsipal-e4](https://huggingface.co/emilios/whisper-sm-el-farsipal-e4) on the Greek subsets of the mozilla-foundation/common_voice_11_0 (el) and google/fleurs (el_gr) datasets. A usage sketch appears at the end of this card.
It achieves the following results on the evaluation set:
- Loss: 0.4871
- Wer: 17.1991

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch appears later in this card):
- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 20000

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer     |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.1259        | 2.49  | 1000  | 0.4834          | 18.3692 |
| 0.1002        | 4.49  | 2000  | 0.4604          | 17.8027 |
| 0.1096        | 6.98  | 3000  | 0.4553          | 17.8770 |
| 0.0885        | 9.46  | 4000  | 0.4551          | 17.9606 |
| 0.0675        | 11.95 | 5000  | 0.4631          | 17.9049 |
| 0.0675        | 14.44 | 6000  | 0.4619          | 17.9049 |
| 0.0645        | 16.93 | 7000  | 0.4678          | 17.6727 |
| 0.0535        | 19.41 | 8000  | 0.4685          | 17.6634 |
| 0.039         | 21.49 | 9000  | 0.4746          | 17.6727 |
| 0.0447        | 23.98 | 10000 | 0.4761          | 17.6634 |
| 0.0393        | 26.46 | 11000 | 0.4792          | 17.7656 |
| 0.0308        | 28.95 | 12000 | 0.4851          | 17.8678 |
| 0.0301        | 31.44 | 13000 | 0.4846          | 17.4499 |
| 0.031         | 33.93 | 14000 | 0.4849          | 17.8306 |
| 0.0263        | 36.41 | 15000 | 0.4880          | 17.6170 |
| 0.0256        | 38.9  | 16000 | 0.4871          | 17.1991 |
| 0.0236        | 41.39 | 17000 | 0.4883          | 17.2641 |
| 0.0195        | 43.88 | 18000 | 0.4880          | 17.5706 |
| 0.0193        | 46.36 | 19000 | 0.4993          | 17.7285 |
| 0.0161        | 48.85 | 20000 | 0.4968          | 17.8306 |

### Framework versions

- Transformers 4.26.0.dev0
- Pytorch 2.0.0.dev20221216+cu116
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2
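
## Training configuration sketch

The original training script is not included in this card. As a rough guide, the values listed under "Training hyperparameters" map onto `Seq2SeqTrainingArguments` roughly as shown below. The `output_dir`, `eval_steps`, and `predict_with_generate` values are assumptions (the 1000-step evaluation cadence is inferred from the results table), not settings stated in this card.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-sm-el-farsipal-e5",  # assumed output directory
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    learning_rate=1e-6,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=20000,
    seed=42,
    # The Adam settings listed above are the optimizer defaults, made explicit here.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=1000,             # assumed; matches the results-table cadence
    predict_with_generate=True,  # assumed; needed to compute WER from generated text
)
```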
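
## How to use

The snippet below is a minimal transcription sketch using the Transformers Whisper classes. The repository id `emilios/whisper-sm-el-farsipal-e5` and the input file `sample_greek.wav` are placeholders for illustration; substitute the actual Hub id of this checkpoint and your own 16 kHz audio.

```python
import torch
import librosa
from transformers import WhisperProcessor, WhisperForConditionalGeneration

# Assumed repository id for this checkpoint; replace with the actual Hub id.
model_id = "emilios/whisper-sm-el-farsipal-e5"

processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)
model.eval()

# Whisper expects 16 kHz mono audio; "sample_greek.wav" is a placeholder file.
audio, _ = librosa.load("sample_greek.wav", sr=16000, mono=True)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

# Force Greek transcription rather than language auto-detection or translation.
forced_ids = processor.get_decoder_prompt_ids(language="greek", task="transcribe")

with torch.no_grad():
    generated = model.generate(inputs.input_features, forced_decoder_ids=forced_ids)

print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```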