DewiBrynJones committed
Commit 53f7c3a
1 Parent(s): c1f25b2

Model save

README.md CHANGED
@@ -2,8 +2,6 @@
  license: apache-2.0
  base_model: facebook/wav2vec2-large-xlsr-53
  tags:
- - automatic-speech-recognition
- - DewiBrynJones/banc-trawsgrifiadau-bangor-normalized
  - generated_from_trainer
  metrics:
  - wer
@@ -17,10 +15,10 @@ should probably proofread and complete it, then remove this comment. -->
 
  # wav2vec2-xlsr-53-ft-btb-cy
 
- This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the DEWIBRYNJONES/BANC-TRAWSGRIFIADAU-BANGOR-NORMALIZED - DEFAULT dataset.
+ This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.9669
- - Wer: 0.6125
+ - Loss: 0.3847
+ - Wer: 0.2967
 
  ## Model description
 
@@ -48,119 +46,38 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
- - num_epochs: 15.0
+ - training_steps: 2500
  - mixed_precision_training: Native AMP
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Wer |
- |:-------------:|:-------:|:-----:|:---------------:|:------:|
- | No log | 0.1414 | 100 | 3.7427 | 1.0 |
- | No log | 0.2829 | 200 | 2.9179 | 1.0 |
- | No log | 0.4243 | 300 | 2.8036 | 1.0 |
- | No log | 0.5658 | 400 | 1.2196 | 0.8934 |
- | 3.574 | 0.7072 | 500 | 0.9860 | 0.7276 |
- | 3.574 | 0.8487 | 600 | 0.8392 | 0.6504 |
- | 3.574 | 0.9901 | 700 | 0.7804 | 0.6029 |
- | 3.574 | 1.1315 | 800 | 0.6122 | 0.4909 |
- | 3.574 | 1.2730 | 900 | 0.5901 | 0.4883 |
- | 0.811 | 1.4144 | 1000 | 0.5500 | 0.4508 |
- | 0.811 | 1.5559 | 1100 | 0.5232 | 0.4142 |
- | 0.811 | 1.6973 | 1200 | 0.5186 | 0.4065 |
- | 0.811 | 1.8388 | 1300 | 0.4953 | 0.3929 |
- | 0.811 | 1.9802 | 1400 | 0.4880 | 0.3928 |
- | 0.6459 | 2.1216 | 1500 | 0.4645 | 0.3692 |
- | 0.6459 | 2.2631 | 1600 | 0.4666 | 0.3586 |
- | 0.6459 | 2.4045 | 1700 | 0.4502 | 0.3593 |
- | 0.6459 | 2.5460 | 1800 | 0.4528 | 0.3638 |
- | 0.6459 | 2.6874 | 1900 | 0.4665 | 0.3926 |
- | 0.5306 | 2.8289 | 2000 | 0.4329 | 0.3505 |
- | 0.5306 | 2.9703 | 2100 | 0.4245 | 0.3374 |
- | 0.5306 | 3.1117 | 2200 | 0.4377 | 0.3340 |
- | 0.5306 | 3.2532 | 2300 | 0.4272 | 0.3337 |
- | 0.5306 | 3.3946 | 2400 | 0.4335 | 0.3326 |
- | 0.4628 | 3.5361 | 2500 | 0.4268 | 0.3275 |
- | 0.4628 | 3.6775 | 2600 | 0.4502 | 0.3409 |
- | 0.4628 | 3.8190 | 2700 | 0.6345 | 0.4390 |
- | 0.4628 | 3.9604 | 2800 | 1.0203 | 0.6403 |
- | 0.4628 | 4.1018 | 2900 | 1.2208 | 0.7922 |
- | 0.8685 | 4.2433 | 3000 | 1.1018 | 0.7387 |
- | 0.8685 | 4.3847 | 3100 | 1.2497 | 0.8062 |
- | 0.8685 | 4.5262 | 3200 | 1.6165 | 0.9616 |
- | 0.8685 | 4.6676 | 3300 | 1.4655 | 0.9217 |
- | 0.8685 | 4.8091 | 3400 | 1.0288 | 0.7465 |
- | 1.3918 | 4.9505 | 3500 | 0.9067 | 0.5948 |
- | 1.3918 | 5.0919 | 3600 | 0.9486 | 0.6353 |
- | 1.3918 | 5.2334 | 3700 | 0.8674 | 0.5428 |
- | 1.3918 | 5.3748 | 3800 | 0.9403 | 0.5793 |
- | 1.3918 | 5.5163 | 3900 | 0.9481 | 0.5764 |
- | 1.0402 | 5.6577 | 4000 | 1.0176 | 0.8257 |
- | 1.0402 | 5.7992 | 4100 | 0.9857 | 0.6343 |
- | 1.0402 | 5.9406 | 4200 | 1.3289 | 0.9014 |
- | 1.0402 | 6.0820 | 4300 | 2.0891 | 0.7125 |
- | 1.0402 | 6.2235 | 4400 | 1.2563 | 0.7696 |
- | 1.2886 | 6.3649 | 4500 | 1.1441 | 0.6927 |
- | 1.2886 | 6.5064 | 4600 | 1.0626 | 0.6573 |
- | 1.2886 | 6.6478 | 4700 | 0.9997 | 0.6423 |
- | 1.2886 | 6.7893 | 4800 | 0.9814 | 0.6380 |
- | 1.2886 | 6.9307 | 4900 | 1.0955 | 0.7651 |
- | 1.0984 | 7.0721 | 5000 | 0.9213 | 0.5883 |
- | 1.0984 | 7.2136 | 5100 | 0.8885 | 0.5933 |
- | 1.0984 | 7.3550 | 5200 | 0.9001 | 0.5899 |
- | 1.0984 | 7.4965 | 5300 | 0.8784 | 0.5859 |
- | 1.0984 | 7.6379 | 5400 | 0.9072 | 0.5898 |
- | 0.9659 | 7.7793 | 5500 | 0.8812 | 0.5841 |
- | 0.9659 | 7.9208 | 5600 | 0.8912 | 0.5855 |
- | 0.9659 | 8.0622 | 5700 | 0.8816 | 0.5807 |
- | 0.9659 | 8.2037 | 5800 | 0.8914 | 0.5803 |
- | 0.9659 | 8.3451 | 5900 | 0.8956 | 0.5810 |
- | 0.9679 | 8.4866 | 6000 | 0.9162 | 0.5780 |
- | 0.9679 | 8.6280 | 6100 | 0.9409 | 0.5810 |
- | 0.9679 | 8.7694 | 6200 | 0.9371 | 0.5781 |
- | 0.9679 | 8.9109 | 6300 | 0.9417 | 0.5790 |
- | 0.9679 | 9.0523 | 6400 | 0.9664 | 0.5784 |
- | 1.0241 | 9.1938 | 6500 | 0.9720 | 0.5775 |
- | 1.0241 | 9.3352 | 6600 | 0.9841 | 0.5784 |
- | 1.0241 | 9.4767 | 6700 | 0.9574 | 0.5887 |
- | 1.0241 | 9.6181 | 6800 | 1.0725 | 0.6068 |
- | 1.0241 | 9.7595 | 6900 | 1.0362 | 0.6000 |
- | 1.0797 | 9.9010 | 7000 | 1.0117 | 0.5914 |
- | 1.0797 | 10.0424 | 7100 | 0.9563 | 0.6058 |
- | 1.0797 | 10.1839 | 7200 | 0.9664 | 0.5978 |
- | 1.0797 | 10.3253 | 7300 | 1.0209 | 0.6022 |
- | 1.0797 | 10.4668 | 7400 | 0.9849 | 0.5975 |
- | 1.0701 | 10.6082 | 7500 | 0.9719 | 0.6057 |
- | 1.0701 | 10.7496 | 7600 | 0.9670 | 0.6123 |
- | 1.0701 | 10.8911 | 7700 | 0.9669 | 0.6125 |
- | 1.0701 | 11.0325 | 7800 | 0.9669 | 0.6125 |
- | 1.0701 | 11.1740 | 7900 | 0.9669 | 0.6125 |
- | 1.0518 | 11.3154 | 8000 | 0.9669 | 0.6125 |
- | 1.0518 | 11.4569 | 8100 | 0.9669 | 0.6125 |
- | 1.0518 | 11.5983 | 8200 | 0.9669 | 0.6125 |
- | 1.0518 | 11.7397 | 8300 | 0.9669 | 0.6125 |
- | 1.0518 | 11.8812 | 8400 | 0.9669 | 0.6125 |
- | 1.0594 | 12.0226 | 8500 | 0.9669 | 0.6125 |
- | 1.0594 | 12.1641 | 8600 | 0.9669 | 0.6125 |
- | 1.0594 | 12.3055 | 8700 | 0.9669 | 0.6125 |
- | 1.0594 | 12.4470 | 8800 | 0.9669 | 0.6125 |
- | 1.0594 | 12.5884 | 8900 | 0.9669 | 0.6125 |
- | 1.0584 | 12.7298 | 9000 | 0.9669 | 0.6125 |
- | 1.0584 | 12.8713 | 9100 | 0.9669 | 0.6125 |
- | 1.0584 | 13.0127 | 9200 | 0.9669 | 0.6125 |
- | 1.0584 | 13.1542 | 9300 | 0.9669 | 0.6125 |
- | 1.0584 | 13.2956 | 9400 | 0.9669 | 0.6125 |
- | 1.0556 | 13.4371 | 9500 | 0.9669 | 0.6125 |
- | 1.0556 | 13.5785 | 9600 | 0.9669 | 0.6125 |
- | 1.0556 | 13.7199 | 9700 | 0.9669 | 0.6125 |
- | 1.0556 | 13.8614 | 9800 | 0.9669 | 0.6125 |
- | 1.0556 | 14.0028 | 9900 | 0.9669 | 0.6125 |
- | 1.0511 | 14.1443 | 10000 | 0.9669 | 0.6125 |
- | 1.0511 | 14.2857 | 10100 | 0.9669 | 0.6125 |
- | 1.0511 | 14.4272 | 10200 | 0.9669 | 0.6125 |
- | 1.0511 | 14.5686 | 10300 | 0.9669 | 0.6125 |
- | 1.0511 | 14.7100 | 10400 | 0.9669 | 0.6125 |
- | 1.0585 | 14.8515 | 10500 | 0.9669 | 0.6125 |
- | 1.0585 | 14.9929 | 10600 | 0.9669 | 0.6125 |
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:------:|:----:|:---------------:|:------:|
+ | No log | 0.1414 | 100 | 3.7464 | 1.0 |
+ | No log | 0.2829 | 200 | 2.9399 | 1.0 |
+ | No log | 0.4243 | 300 | 2.5961 | 0.9991 |
+ | No log | 0.5658 | 400 | 1.1619 | 0.7905 |
+ | 3.5448 | 0.7072 | 500 | 0.9466 | 0.6898 |
+ | 3.5448 | 0.8487 | 600 | 0.7895 | 0.6111 |
+ | 3.5448 | 0.9901 | 700 | 0.6820 | 0.5379 |
+ | 3.5448 | 1.1315 | 800 | 0.6039 | 0.4724 |
+ | 3.5448 | 1.2730 | 900 | 0.5631 | 0.4675 |
+ | 0.7808 | 1.4144 | 1000 | 0.5279 | 0.4291 |
+ | 0.7808 | 1.5559 | 1100 | 0.5024 | 0.3994 |
+ | 0.7808 | 1.6973 | 1200 | 0.4895 | 0.3895 |
+ | 0.7808 | 1.8388 | 1300 | 0.4596 | 0.3696 |
+ | 0.7808 | 1.9802 | 1400 | 0.4473 | 0.3611 |
+ | 0.6005 | 2.1216 | 1500 | 0.4332 | 0.3474 |
+ | 0.6005 | 2.2631 | 1600 | 0.4269 | 0.3418 |
+ | 0.6005 | 2.4045 | 1700 | 0.4155 | 0.3361 |
+ | 0.6005 | 2.5460 | 1800 | 0.4121 | 0.3214 |
+ | 0.6005 | 2.6874 | 1900 | 0.4145 | 0.3366 |
+ | 0.4666 | 2.8289 | 2000 | 0.3939 | 0.3114 |
+ | 0.4666 | 2.9703 | 2100 | 0.3889 | 0.3081 |
+ | 0.4666 | 3.1117 | 2200 | 0.3909 | 0.3064 |
+ | 0.4666 | 3.2532 | 2300 | 0.3874 | 0.3015 |
+ | 0.4666 | 3.3946 | 2400 | 0.3869 | 0.2983 |
+ | 0.3805 | 3.5361 | 2500 | 0.3847 | 0.2967 |
 
 
  ### Framework versions
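The `Wer` column in the results table is the word error rate: the word-level edit distance between hypothesis and reference (substitutions + insertions + deletions) divided by the number of reference words. The training run itself almost certainly used a library implementation (e.g. `evaluate` or `jiwer`); the standalone function below is only an illustrative sketch of the metric.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming table: d[i][j] is the edit distance between
    # the first i reference words and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution (or match)
            )
    return d[len(ref)][len(hyp)] / len(ref)
```

A WER of 0.2967, as reported above, means roughly 3 word errors per 10 reference words; note that insertions can push WER above 1.0.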
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:425848f79e9afbb7d4bbc2a8abb61314dfc85bf46cae59836f3c9994780b530e
+ oid sha256:7979b4b8d82e5537f519481e94aae0b5714d4fd1eb26fff32beb44d21502612b
  size 1262041180
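The `model.safetensors` entry above is a Git LFS pointer file, not the weights themselves: a short text file recording a `version` URL, an `oid` (the SHA-256 of the stored object), and a `size` in bytes. A small illustrative parser for this pointer format (field names follow the git-lfs v1 spec; the helper name here is made up):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a dict of its key/value lines.

    Each line of a v1 pointer is "<key> <value>", e.g.
    "oid sha256:7979b4..." or "size 1262041180".
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The pointer content shown in the diff above (new side).
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:7979b4b8d82e5537f519481e94aae0b5714d4fd1eb26fff32beb44d21502612b
size 1262041180
"""
info = parse_lfs_pointer(pointer)
```

The unchanged `size 1262041180` on both sides of the diff shows the checkpoint has the same byte size before and after; only the content hash (`oid`) changed.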
runs/May10_18-59-42_09e070d6a7b1/events.out.tfevents.1715364171.09e070d6a7b1.452.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7c3a585483014e292c14b6e4bff8ab0b07ef4cf8b35ebb5563c0c44954f43530
- size 14895
+ oid sha256:5a947d8a706447093356fccb64b115ef40d4a8a83cccb6088be8e3a1f92e84de
+ size 15778
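The hyperparameters in the README diff specify `lr_scheduler_type: linear` with `lr_scheduler_warmup_steps: 500` and, in the updated card, `training_steps: 2500`. That schedule ramps the learning rate linearly from 0 to its peak over the warmup steps, then decays it linearly back to 0. The peak learning rate is not visible in the diff, so `base_lr` below is a placeholder value, not the one actually used:

```python
def linear_lr(step: int,
              base_lr: float = 3e-4,       # placeholder: peak LR not shown in the diff
              warmup_steps: int = 500,     # from lr_scheduler_warmup_steps
              total_steps: int = 2500) -> float:
    """Linear warmup then linear decay, as in the trainer's 'linear' scheduler."""
    if step < warmup_steps:
        # Ramp up: 0 -> base_lr over the first warmup_steps steps.
        return base_lr * step / warmup_steps
    # Decay: base_lr -> 0 over the remaining steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

Under this schedule the final evaluation at step 2500 (the last table row) coincides with the learning rate reaching 0, which is why the updated run stops there.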