---
language:
- uk
license: apache-2.0
tags:
- whisper-event
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_11_0
- google/fleurs
model-index:
- name: whisper-base-uk
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Common Voice 11.0
      type: mozilla-foundation/common_voice_11_0
      config: hy-AM
      split: test
      args: hy-AM
    metrics:
    - name: Wer
      type: wer
      value: 10.286876675348378
---

# whisper-base-uk

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on the Common Voice 11.0 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 1.3201
- eval_wer: 10.2869

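For reference, here is a minimal inference sketch (not part of the original card) using the `transformers` pipeline API. The repo id `arampacha/whisper-base-uk` is an assumption inferred from the commit author and model name, and `sample.wav` is a placeholder audio file.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; "arampacha/whisper-base-uk" is an assumed repo id.
transcriber = pipeline(
    "automatic-speech-recognition",
    model="arampacha/whisper-base-uk",
)

# Whisper models expect 16 kHz audio; the pipeline resamples file inputs.
result = transcriber("sample.wav")  # placeholder path
print(result["text"])
```
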
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a training-arguments sketch after the list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP

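As an illustration (not from the original card), these settings map onto `Seq2SeqTrainingArguments` from `transformers` roughly as follows. The `output_dir` is a placeholder; the Adam betas and epsilon listed above match the library defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the listed hyperparameters; output_dir is a placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-base-uk",
    learning_rate=1e-5,
    per_device_train_batch_size=4,   # train_batch_size: 4
    per_device_eval_batch_size=4,    # eval_batch_size: 4
    gradient_accumulation_steps=16,  # 4 * 16 = total_train_batch_size of 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,                # lr_scheduler_warmup_steps
    max_steps=5000,                  # training_steps
    fp16=True,                       # mixed_precision_training: Native AMP
)
```
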
### Framework versions

- Transformers 4.26.0.dev0
- PyTorch 1.13.0+cu116
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2