ymoslem committed on
Commit 4fd18b3
1 Parent(s): 4de923f

Update README.md

Files changed (1):
  1. README.md +8 -3
README.md CHANGED
@@ -16,6 +16,7 @@ datasets:
 metrics:
 - bleu
 - wer
+- chrf
 model-index:
 - name: Whisper Small GA-EN Speech Translation
   results:
@@ -23,7 +24,9 @@ model-index:
       name: Automatic Speech Recognition
       type: automatic-speech-recognition
     dataset:
-      name: IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia + augmented
+      name: >-
+        IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia +
+        augmented
       type: ymoslem/IWSLT2023-GA-EN
     metrics:
     - name: Bleu
@@ -69,8 +72,10 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+- warmup_steps: 0
 - training_steps: 3000
 - mixed_precision_training: Native AMP
+- generation_max_length: 128
 
 ### Training results
 
@@ -96,7 +101,7 @@ The following hyperparameters were used during training:
 | 0.1946 | 0.7881 | 1800 | 1.2820 | 26.17 | 42.46 | 64.9257 |
 | 0.1588 | 0.8319 | 1900 | 1.3172 | 26.9 | 43.02 | 63.5299 |
 | 0.1322 | 0.8757 | 2000 | 1.3248 | 27.78 | 43.53 | 63.8001 |
-| 0.1134 | 0.9194 | 2100 | 1.3198 | 28.98 | 45.27 | 72.7600 |
+| 0.1134 | 0.9194 | **2100** | 1.3198 | 28.98 | 45.27 | 72.7600 |
 | 0.1031 | 0.9632 | 2200 | 1.3502 | 29.18 | 44.77 | 68.3476 |
 | 0.0518 | 1.0070 | 2300 | 1.3433 | 28.6 | 42.96 | 69.0230 |
 | 0.0481 | 1.0508 | 2400 | 1.3715 | 29.01 | 44.46 | 69.6983 |
@@ -113,4 +118,4 @@ The following hyperparameters were used during training:
 - Transformers 4.40.2
 - Pytorch 2.2.0+cu121
 - Datasets 2.19.1
-- Tokenizers 0.19.1
+- Tokenizers 0.19.1
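The dataset `name` change in the second hunk rewrites a long plain scalar as a YAML folded block scalar (`>-`), which lets the value wrap across source lines in the metadata while still parsing back to a single-line string. A minimal sketch of the equivalence, assuming PyYAML is available (the exact parser Hugging Face uses is not stated here):

```python
import yaml

# The original single-line form of the dataset name.
plain = (
    "name: IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, "
    "and Wikimedia + augmented\n"
)

# The folded form introduced by this commit: '>' folds line breaks
# into spaces, and '-' chomps the trailing newline.
folded = (
    "name: >-\n"
    "  IWSLT-2023, FLEURS, BiteSize, SpokenWords, Tatoeba, and Wikimedia +\n"
    "  augmented\n"
)

# Both documents parse to the same single-line string value.
assert yaml.safe_load(plain) == yaml.safe_load(folded)
```

This is why the diff can split the name over two lines without changing what downstream tooling sees.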
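The training-results table tracks BLEU, WER, and (with this commit's metadata addition) ChrF. As a rough illustration of what the WER column measures, here is a minimal word error rate implementation: word-level Levenshtein distance divided by reference length. This is only a sketch; reported scores typically come from a library such as jiwer, possibly after text normalization:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over word tokens.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all remaining reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all remaining hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + sub,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

The table reports WER as a percentage, so a value like 42.46 corresponds to `wer(...) * 100`; lower is better, while higher BLEU and ChrF are better.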