emilios committed
Commit 1303559
1 Parent(s): 761a953

update model card README.md

Files changed (1): README.md (+13, -33)
README.md CHANGED
@@ -3,39 +3,30 @@ language:
  - el
  license: apache-2.0
  tags:
- - whisper-event
+ - hf-asr-leaderboard, whisper-medium, mozilla-foundation/common_voice_11_0, greek,
+   whisper-event
  - generated_from_trainer
- - hf-asr-leaderboard
  datasets:
  - mozilla-foundation/common_voice_11_0
- - google/fleurs
- metrics:
- - wer
  model-index:
  - name: Whisper Medium El Greco Greek
-   results:
-   - task:
-       name: Automatic Speech Recognition
-       type: automatic-speech-recognition
-     dataset:
-       name: mozilla-foundation/common_voice_11_0
-       type: mozilla-foundation/common_voice_11_0
-       args: 'config: el, split: test'
-     metrics:
-     - name: Wer
-       type: wer
-       value: 13.976597325408619
+   results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # Whisper Medium El - Greek One
+ # Whisper Medium El Greco Greek

  This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the Common Voice 11.0 dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4707
- - Wer: 13.9766
+ - eval_loss: 0.3924
+ - eval_wer: 12.4443
+ - eval_runtime: 1211.1631
+ - eval_samples_per_second: 1.4
+ - eval_steps_per_second: 0.088
+ - epoch: 4.04
+ - step: 5000

  ## Model description

@@ -55,8 +46,8 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 1e-05
- - train_batch_size: 20
- - eval_batch_size: 8
+ - train_batch_size: 32
+ - eval_batch_size: 16
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
@@ -64,17 +55,6 @@ The following hyperparameters were used during training:
  - training_steps: 5000
  - mixed_precision_training: Native AMP

- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Wer     |
- |:-------------:|:-----:|:----:|:---------------:|:-------:|
- | 0.0036        | 10.01 | 1000 | 0.4461          | 15.9082 |
- | 0.0001        | 20.02 | 2000 | 0.4250          | 14.5245 |
- | 0.0           | 31.0  | 3000 | 0.4526          | 14.1902 |
- | 0.0           | 41.01 | 4000 | 0.4657          | 14.1252 |
- | 0.0           | 52.0  | 5000 | 0.4707          | 13.9766 |
-
-
  ### Framework versions

  - Transformers 4.26.0.dev0
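
As a reading aid for the hyperparameter list in the updated card, here is a minimal sketch of how those values could be expressed with `transformers.Seq2SeqTrainingArguments`. It is a reconstruction for illustration only: the `output_dir` is a placeholder, and the Adam settings (betas=(0.9,0.999), epsilon=1e-08) are the library defaults rather than explicit arguments.

```python
# Hedged reconstruction of the training configuration described in the card.
# Not the author's actual training script: values come from the hyperparameter
# list above; everything else is a placeholder or a library default.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-el",   # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=5000,                     # training_steps: 5000
    fp16=True,                          # "Native AMP" mixed precision (requires a CUDA device)
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer in transformers
)
```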
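
For completeness, a minimal inference sketch using the `transformers` automatic-speech-recognition pipeline. The repository id and audio filename below are placeholders, since the diff does not state the model's repo path; they only illustrate how a Whisper fine-tune like this one is typically loaded.

```python
# Minimal inference sketch for a fine-tuned Whisper Medium Greek checkpoint.
# "your-namespace/whisper-medium-el" and "greek_sample.wav" are placeholders.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-namespace/whisper-medium-el",  # placeholder repo id
)

# The pipeline accepts a path to an audio file (or a 16 kHz waveform array).
result = asr("greek_sample.wav")
print(result["text"])
```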