Badr Abdullah committed
Commit f5c1648
1 Parent(s): 1f45a71

Model save

Files changed (1):
  1. README.md  +14 −15
README.md CHANGED
@@ -28,7 +28,7 @@ model-index:
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/badr-nlp/xlsr-continual-finetuning-new/runs/oduf7onr)
+[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/badr-nlp/xlsr-continual-finetuning-new/runs/mfm1kgzn)
 # mHuBERT-147-upper-sorbian
 
 This model is a fine-tuned version of [utter-project/mHuBERT-147](https://huggingface.co/utter-project/mHuBERT-147) on the common_voice_17_0 dataset.
@@ -64,34 +64,33 @@ The following hyperparameters were used during training:
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 500
 - num_epochs: 100
-- mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
 |:-------------:|:-------:|:----:|:---------------:|:---:|:---:|
-| 4.0738 | 3.9216 | 100 | 4.0797 | 1.0 | 1.0 |
-| 3.223 | 7.8431 | 200 | 3.2273 | 1.0 | 1.0 |
-| 3.1741 | 11.7647 | 300 | 3.2232 | 1.0 | 1.0 |
-| 3.2292 | 15.6863 | 400 | 3.2237 | 1.0 | 1.0 |
-| 3.2105 | 19.6078 | 500 | 3.2269 | 1.0 | 1.0 |
-| 3.1911 | 23.5294 | 600 | 3.2202 | 1.0 | 1.0 |
+| 3.8611 | 3.9216 | 100 | 3.8610 | 1.0 | 1.0 |
+| 3.2233 | 7.8431 | 200 | 3.2270 | 1.0 | 1.0 |
+| 3.1742 | 11.7647 | 300 | 3.2232 | 1.0 | 1.0 |
+| 3.2292 | 15.6863 | 400 | 3.2238 | 1.0 | 1.0 |
+| 3.2105 | 19.6078 | 500 | 3.2270 | 1.0 | 1.0 |
+| 3.191 | 23.5294 | 600 | 3.2201 | 1.0 | 1.0 |
 | 3.2626 | 27.4510 | 700 | 3.2177 | 1.0 | 1.0 |
-| 3.21 | 31.3725 | 800 | 3.2232 | 1.0 | 1.0 |
+| 3.21 | 31.3725 | 800 | 3.2230 | 1.0 | 1.0 |
 | 3.1871 | 35.2941 | 900 | 3.2211 | 1.0 | 1.0 |
-| 3.2224 | 39.2157 | 1000 | 3.2249 | 1.0 | 1.0 |
-| 3.2408 | 43.1373 | 1100 | 3.2215 | 1.0 | 1.0 |
+| 3.2221 | 39.2157 | 1000 | 3.2245 | 1.0 | 1.0 |
+| 3.2408 | 43.1373 | 1100 | 3.2217 | 1.0 | 1.0 |
 | 3.2 | 47.0588 | 1200 | 3.2193 | 1.0 | 1.0 |
 | 3.202 | 50.9804 | 1300 | 3.2181 | 1.0 | 1.0 |
 | 3.2286 | 54.9020 | 1400 | 3.2190 | 1.0 | 1.0 |
 | 3.1863 | 58.8235 | 1500 | 3.2187 | 1.0 | 1.0 |
-| 3.1868 | 62.7451 | 1600 | 3.2174 | 1.0 | 1.0 |
+| 3.1868 | 62.7451 | 1600 | 3.2175 | 1.0 | 1.0 |
 | 3.226 | 66.6667 | 1700 | 3.2199 | 1.0 | 1.0 |
-| 3.1944 | 70.5882 | 1800 | 3.2195 | 1.0 | 1.0 |
+| 3.1944 | 70.5882 | 1800 | 3.2196 | 1.0 | 1.0 |
 | 3.1997 | 74.5098 | 1900 | 3.2180 | 1.0 | 1.0 |
 | 3.2184 | 78.4314 | 2000 | 3.2200 | 1.0 | 1.0 |
-| 3.2252 | 82.3529 | 2100 | 3.2189 | 1.0 | 1.0 |
-| 3.208 | 86.2745 | 2200 | 3.2176 | 1.0 | 1.0 |
+| 3.2252 | 82.3529 | 2100 | 3.2188 | 1.0 | 1.0 |
+| 3.208 | 86.2745 | 2200 | 3.2175 | 1.0 | 1.0 |
 | 3.2122 | 90.1961 | 2300 | 3.2170 | 1.0 | 1.0 |
 | 3.2307 | 94.1176 | 2400 | 3.2169 | 1.0 | 1.0 |
 | 3.1852 | 98.0392 | 2500 | 3.2172 | 1.0 | 1.0 |
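The hyperparameters in the diff specify `lr_scheduler_type: linear` with `lr_scheduler_warmup_steps: 500`, and the results table ends at step 2500. As a rough sketch only (not the Trainer's exact implementation, and the total of 2500 steps is inferred from the table), the learning-rate multiplier such a schedule applies could look like this:

```python
def linear_schedule(step: int, warmup_steps: int = 500, total_steps: int = 2500) -> float:
    """Multiplier applied to the base learning rate at a given step.

    Sketch of a linear warmup-then-decay schedule matching the card's
    lr_scheduler settings; total_steps is an assumption taken from the
    last step shown in the training-results table.
    """
    if step < warmup_steps:
        return step / warmup_steps  # ramp linearly from 0 up to 1
    # then decay linearly from 1 back down to 0 over the remaining steps
    return max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the multiplier is 0.5 halfway through warmup (step 250), peaks at 1.0 at step 500, and reaches 0.0 at step 2500.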