---
language:
- 'no'
license: apache-2.0
base_model: NbAiLabBeta/nb-whisper-large
tags:
- audio
- asr
- automatic-speech-recognition
- hf-asr-leaderboard
model-index:
- name: nb-whisper-large-v0.8-vad3-verbatim
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# nb-whisper-large-v0.8-vad3-verbatim
This model is a fine-tuned version of [NbAiLabBeta/nb-whisper-large](https://huggingface.co/NbAiLabBeta/nb-whisper-large) on the NbAiLab/NPSC dataset.
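A minimal inference sketch with the Hugging Face `pipeline` API is shown below. The repository id `NbAiLab/nb-whisper-large-v0.8-vad3-verbatim` and the audio path are assumptions based on the model name; they are not confirmed by this card.

```python
# Hedged inference sketch; the repository id and audio path below are assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLab/nb-whisper-large-v0.8-vad3-verbatim",  # assumed repository id
)

# Transcribe a local audio file; chunking handles recordings longer than 30 seconds.
result = asr(
    "audio.mp3",
    chunk_length_s=30,
    generate_kwargs={"language": "no", "task": "transcribe"},
)
print(result["text"])
```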
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged optimizer/schedule sketch follows the list):
- learning_rate: 7e-05
- lr_scheduler_type: linear
- per_device_train_batch_size: 8
- total_train_batch_size_per_node: 32
- total_train_batch_size: 1024
- total_optimization_steps: 250
- starting_optimization_step: None
- finishing_optimization_step: 250
- num_train_dataset_workers: 32
- num_hosts: 32
- total_num_training_examples: 256,000
- steps_per_epoch: 97
- num_beams: None
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.98
- adam_epsilon: 1e-06
- dropout: True
- bpe_dropout_probability: 0.2
- activation_dropout_probability: 0.1
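As referenced above, this is a minimal sketch of the optimizer and learning-rate schedule implied by the listed hyperparameters, written with `optax`. The actual training script is not part of this card, and the warmup length is not stated, so zero warmup is assumed.

```python
# Optimizer/schedule sketch inferred from the hyperparameters above (assumptions noted inline).
import optax

total_optimization_steps = 250    # total_optimization_steps
peak_learning_rate = 7e-5         # learning_rate, decayed linearly (lr_scheduler_type: linear)

# Effective batch size: 32 hosts x 32 examples per node = 1,024 examples per optimization step.
total_train_batch_size = 32 * 32

# Warmup length is not listed in the card, so no warmup is assumed here.
schedule = optax.linear_schedule(
    init_value=peak_learning_rate,
    end_value=0.0,
    transition_steps=total_optimization_steps,
)

# AdamW with the betas, epsilon and weight decay listed above.
optimizer = optax.adamw(
    learning_rate=schedule,
    b1=0.9,        # adam_beta1
    b2=0.98,       # adam_beta2
    eps=1e-6,      # adam_epsilon
    weight_decay=0.01,
)
```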
### Training results
| step | validation_loss | train_loss | validation_wer | validation_cer | validation_exact_wer | validation_exact_cer |
|:----:|:---------------:|:----------:|:--------------:|:--------------:|:--------------------:|:--------------------:|
| 0 | 1.2831 | 1.1864 | 18.9083 | 11.8409 | 33.9801 | 15.0322 |
| 40 | 0.5952 | 0.4958 | 8.9760 | 2.9212 | 9.1099 | 2.9390 |
| 80 | 0.5848 | 0.4761 | 8.3105 | 2.6432 | 8.4330 | 2.6621 |
| 120 | 0.5831 | 0.4492 | 8.1204 | 2.5679 | 8.2356 | 2.5821 |
| 160 | 0.5811 | 0.4678 | 7.9302 | 2.5051 | 8.0438 | 2.5193 |
| 200 | 0.5840 | 0.4692 | 7.9861 | 2.5346 | 8.0945 | 2.5498 |
| 240 | 0.5844 | 0.4543 | 7.9246 | 2.5051 | 8.0381 | 2.5193 |
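The evaluation script is not included in this card. Below is a minimal sketch of how WER/CER-style figures like those in the table can be computed with the `evaluate` library; the example transcripts are hypothetical, and the ×100 scaling matches the percentage-style numbers above.

```python
# Hedged metric sketch; the example strings are hypothetical, not from the NPSC validation set.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["dette er en test"]       # hypothetical model output
references = ["dette er en liten test"]  # hypothetical reference transcript

# Scale by 100 to match the percentage-style figures reported in the table.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```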
### Framework versions
- Transformers 4.36.2
- Datasets 2.16.1
- Tokenizers 0.15.0