# 4
This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 2.0909
- Wer: 65.6470
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 1
- mixed_precision_training: Native AMP
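
The hyperparameters above can be expressed, roughly, as the following 🤗 Transformers trainer configuration. This is a sketch under assumptions: the argument names follow the `Seq2SeqTrainingArguments` API, and `output_dir` is a placeholder, not the directory used for this model.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
# "whisper-small-finetune" is a hypothetical output directory.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-finetune",
    learning_rate=5e-7,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=1,
    fp16=True,  # Native AMP mixed-precision training
)
```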
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 4.2263        | 0.04  | 10   | 4.0304          | 15.6470 |
| 3.8803        | 0.08  | 20   | 3.6564          | 15.4712 |
| 3.7847        | 0.13  | 30   | 3.2816          | 14.6624 |
| 3.4068        | 0.17  | 40   | 3.0934          | 14.3108 |
| 3.0547        | 0.21  | 50   | 2.9635          | 14.0647 |
| 3.0245        | 0.25  | 60   | 2.8523          | 17.7918 |
| 3.3518        | 0.29  | 70   | 2.7638          | 18.8467 |
| 3.105         | 0.34  | 80   | 2.6917          | 20.8509 |
| 2.8325        | 0.38  | 90   | 2.6234          | 21.3783 |
| 2.8667        | 0.42  | 100  | 2.5582          | 21.4838 |
| 2.7528        | 0.46  | 110  | 2.4994          | 21.2377 |
| 2.8502        | 0.5   | 120  | 2.4430          | 28.0591 |
| 2.5623        | 0.55  | 130  | 2.3898          | 30.5556 |
| 2.877         | 0.59  | 140  | 2.3409          | 36.0056 |
| 2.6492        | 0.63  | 150  | 2.2965          | 45.1828 |
| 2.4119        | 0.67  | 160  | 2.2554          | 60.0563 |
| 2.6273        | 0.71  | 170  | 2.2175          | 73.8045 |
| 2.4091        | 0.76  | 180  | 2.1841          | 70.2883 |
| 2.2931        | 0.8   | 190  | 2.1558          | 74.8242 |
| 2.297         | 0.84  | 200  | 2.1324          | 71.0970 |
| 2.1893        | 0.88  | 210  | 2.1133          | 72.0113 |
| 2.3408        | 0.92  | 220  | 2.0994          | 70.8861 |
| 2.2892        | 0.97  | 230  | 2.0909          | 65.6470 |
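
The Wer column above is a word error rate in percent. As a reference for how that metric is defined, here is a minimal sketch of the standard WER computation (word-level Levenshtein distance divided by reference length); this is an illustration of the formula, not the exact scorer used during this training run.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level edit distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return 100.0 * d[len(ref)][len(hyp)] / len(ref)

# Two deleted words out of six reference words -> 33.33...%
print(wer("the cat sat on the mat", "the cat sat mat"))
```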
### Framework versions
- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1