JBZhang2342 committed on
Commit
6f7698e
1 Parent(s): 10bd194

Model save

README.md CHANGED
@@ -1,26 +1,21 @@
  ---
- language:
- - en
  license: mit
  base_model: microsoft/speecht5_tts
  tags:
- - en_accent,mozilla,t5,common_voice_1_0
  - generated_from_trainer
- datasets:
- - mozilla-foundation/common_voice_1_0
  model-index:
- - name: SpeechT5 TTS English Accented
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- # SpeechT5 TTS English Accented

- This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the Common Voice dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.6228

  ## Model description

@@ -46,133 +41,53 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
- - training_steps: 30000
  - mixed_precision_training: Native AMP

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:------:|:-----:|:---------------:|
- | No log | 3.85 | 250 | 0.5310 |
- | 0.6287 | 7.69 | 500 | 0.5088 |
- | 0.6287 | 11.54 | 750 | 0.4855 |
- | 0.5138 | 15.38 | 1000 | 0.4986 |
- | 0.5138 | 19.23 | 1250 | 0.4820 |
- | 0.4735 | 23.08 | 1500 | 0.4775 |
- | 0.4735 | 26.92 | 1750 | 0.5104 |
- | 0.4512 | 30.77 | 2000 | 0.4953 |
- | 0.4512 | 34.62 | 2250 | 0.4838 |
- | 0.4419 | 38.46 | 2500 | 0.4969 |
- | 0.4419 | 42.31 | 2750 | 0.5057 |
- | 0.4313 | 46.15 | 3000 | 0.4931 |
- | 0.4313 | 50.0 | 3250 | 0.4975 |
- | 0.4164 | 53.85 | 3500 | 0.5145 |
- | 0.4164 | 57.69 | 3750 | 0.5070 |
- | 0.4055 | 61.54 | 4000 | 0.4921 |
- | 0.4055 | 65.38 | 4250 | 0.5139 |
- | 0.3999 | 69.23 | 4500 | 0.5111 |
- | 0.3999 | 73.08 | 4750 | 0.5118 |
- | 0.3895 | 76.92 | 5000 | 0.5184 |
- | 0.3895 | 80.77 | 5250 | 0.5246 |
- | 0.3843 | 84.62 | 5500 | 0.5244 |
- | 0.3843 | 88.46 | 5750 | 0.5252 |
- | 0.3731 | 92.31 | 6000 | 0.5092 |
- | 0.3731 | 96.15 | 6250 | 0.5098 |
- | 0.3698 | 100.0 | 6500 | 0.5357 |
- | 0.3698 | 103.85 | 6750 | 0.5315 |
- | 0.363 | 107.69 | 7000 | 0.5297 |
- | 0.363 | 111.54 | 7250 | 0.5429 |
- | 0.358 | 115.38 | 7500 | 0.5418 |
- | 0.358 | 119.23 | 7750 | 0.5483 |
- | 0.3539 | 123.08 | 8000 | 0.5449 |
- | 0.3539 | 126.92 | 8250 | 0.5466 |
- | 0.3503 | 130.77 | 8500 | 0.5505 |
- | 0.3503 | 134.62 | 8750 | 0.5402 |
- | 0.346 | 138.46 | 9000 | 0.5372 |
- | 0.346 | 142.31 | 9250 | 0.5547 |
- | 0.3421 | 146.15 | 9500 | 0.5650 |
- | 0.3421 | 150.0 | 9750 | 0.5544 |
- | 0.3376 | 153.85 | 10000 | 0.5594 |
- | 0.3376 | 157.69 | 10250 | 0.5624 |
- | 0.3331 | 161.54 | 10500 | 0.5574 |
- | 0.3331 | 165.38 | 10750 | 0.5605 |
- | 0.3285 | 169.23 | 11000 | 0.5710 |
- | 0.3285 | 173.08 | 11250 | 0.5671 |
- | 0.3253 | 176.92 | 11500 | 0.5561 |
- | 0.3253 | 180.77 | 11750 | 0.5677 |
- | 0.3233 | 184.62 | 12000 | 0.5841 |
- | 0.3233 | 188.46 | 12250 | 0.5770 |
- | 0.3203 | 192.31 | 12500 | 0.5705 |
- | 0.3203 | 196.15 | 12750 | 0.5642 |
- | 0.317 | 200.0 | 13000 | 0.5830 |
- | 0.317 | 203.85 | 13250 | 0.5800 |
- | 0.3132 | 207.69 | 13500 | 0.5833 |
- | 0.3132 | 211.54 | 13750 | 0.5658 |
- | 0.31 | 215.38 | 14000 | 0.5874 |
- | 0.31 | 219.23 | 14250 | 0.5911 |
- | 0.3084 | 223.08 | 14500 | 0.5907 |
- | 0.3084 | 226.92 | 14750 | 0.5982 |
- | 0.3046 | 230.77 | 15000 | 0.5962 |
- | 0.3046 | 234.62 | 15250 | 0.5846 |
- | 0.3003 | 238.46 | 15500 | 0.5886 |
- | 0.3003 | 242.31 | 15750 | 0.6019 |
- | 0.2995 | 246.15 | 16000 | 0.6022 |
- | 0.2995 | 250.0 | 16250 | 0.5986 |
- | 0.2985 | 253.85 | 16500 | 0.5994 |
- | 0.2985 | 257.69 | 16750 | 0.5967 |
- | 0.2925 | 261.54 | 17000 | 0.5928 |
- | 0.2925 | 265.38 | 17250 | 0.6138 |
- | 0.2911 | 269.23 | 17500 | 0.6000 |
- | 0.2911 | 273.08 | 17750 | 0.6025 |
- | 0.2909 | 276.92 | 18000 | 0.5917 |
- | 0.2909 | 280.77 | 18250 | 0.6016 |
- | 0.2875 | 284.62 | 18500 | 0.6151 |
- | 0.2875 | 288.46 | 18750 | 0.6035 |
- | 0.2866 | 292.31 | 19000 | 0.6019 |
- | 0.2866 | 296.15 | 19250 | 0.6014 |
- | 0.2821 | 300.0 | 19500 | 0.6029 |
- | 0.2821 | 303.85 | 19750 | 0.5953 |
- | 0.2814 | 307.69 | 20000 | 0.6202 |
- | 0.2814 | 311.54 | 20250 | 0.5953 |
- | 0.2798 | 315.38 | 20500 | 0.6153 |
- | 0.2798 | 319.23 | 20750 | 0.6232 |
- | 0.2766 | 323.08 | 21000 | 0.6175 |
- | 0.2766 | 326.92 | 21250 | 0.6162 |
- | 0.2755 | 330.77 | 21500 | 0.6047 |
- | 0.2755 | 334.62 | 21750 | 0.6052 |
- | 0.2742 | 338.46 | 22000 | 0.6138 |
- | 0.2742 | 342.31 | 22250 | 0.6225 |
- | 0.2746 | 346.15 | 22500 | 0.6015 |
- | 0.2746 | 350.0 | 22750 | 0.6029 |
- | 0.2716 | 353.85 | 23000 | 0.6105 |
- | 0.2716 | 357.69 | 23250 | 0.6132 |
- | 0.2697 | 361.54 | 23500 | 0.6129 |
- | 0.2697 | 365.38 | 23750 | 0.6045 |
- | 0.2704 | 369.23 | 24000 | 0.6155 |
- | 0.2704 | 373.08 | 24250 | 0.6075 |
- | 0.2694 | 376.92 | 24500 | 0.6154 |
- | 0.2694 | 380.77 | 24750 | 0.6263 |
- | 0.2672 | 384.62 | 25000 | 0.6181 |
- | 0.2672 | 388.46 | 25250 | 0.6185 |
- | 0.2649 | 392.31 | 25500 | 0.6131 |
- | 0.2649 | 396.15 | 25750 | 0.6113 |
- | 0.2641 | 400.0 | 26000 | 0.6151 |
- | 0.2641 | 403.85 | 26250 | 0.6219 |
- | 0.2642 | 407.69 | 26500 | 0.6228 |
- | 0.2642 | 411.54 | 26750 | 0.6258 |
- | 0.2621 | 415.38 | 27000 | 0.6161 |
- | 0.2621 | 419.23 | 27250 | 0.6316 |
- | 0.2634 | 423.08 | 27500 | 0.6159 |
- | 0.2634 | 426.92 | 27750 | 0.6192 |
- | 0.2611 | 430.77 | 28000 | 0.6210 |
- | 0.2611 | 434.62 | 28250 | 0.6246 |
- | 0.2593 | 438.46 | 28500 | 0.6142 |
- | 0.2593 | 442.31 | 28750 | 0.6157 |
- | 0.26 | 446.15 | 29000 | 0.6198 |
- | 0.26 | 450.0 | 29250 | 0.6182 |
- | 0.262 | 453.85 | 29500 | 0.6188 |
- | 0.262 | 457.69 | 29750 | 0.6223 |
- | 0.2616 | 461.54 | 30000 | 0.6228 |

  ### Framework versions
 
  ---
  license: mit
  base_model: microsoft/speecht5_tts
  tags:
  - generated_from_trainer
  model-index:
+ - name: speecht5_tts
  results: []
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

+ # speecht5_tts

+ This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the None dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.5854

  ## Model description

  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 500
+ - training_steps: 10000
  - mixed_precision_training: Native AMP

  ### Training results

+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:-----:|:---------------:|
+ | No log | 1.41 | 250 | 0.5448 |
+ | 0.6715 | 2.82 | 500 | 0.5147 |
+ | 0.6715 | 4.24 | 750 | 0.5225 |
+ | 0.5532 | 5.65 | 1000 | 0.5096 |
+ | 0.5532 | 7.06 | 1250 | 0.5293 |
+ | 0.5156 | 8.47 | 1500 | 0.5310 |
+ | 0.5156 | 9.89 | 1750 | 0.5417 |
+ | 0.4874 | 11.3 | 2000 | 0.5185 |
+ | 0.4874 | 12.71 | 2250 | 0.5112 |
+ | 0.4693 | 14.12 | 2500 | 0.5154 |
+ | 0.4693 | 15.54 | 2750 | 0.5148 |
+ | 0.4619 | 16.95 | 3000 | 0.5367 |
+ | 0.4619 | 18.36 | 3250 | 0.5207 |
+ | 0.447 | 19.77 | 3500 | 0.5318 |
+ | 0.447 | 21.19 | 3750 | 0.5286 |
+ | 0.4348 | 22.6 | 4000 | 0.5345 |
+ | 0.4348 | 24.01 | 4250 | 0.5362 |
+ | 0.4237 | 25.42 | 4500 | 0.5568 |
+ | 0.4237 | 26.84 | 4750 | 0.5352 |
+ | 0.4195 | 28.25 | 5000 | 0.5395 |
+ | 0.4195 | 29.66 | 5250 | 0.5487 |
+ | 0.4132 | 31.07 | 5500 | 0.5443 |
+ | 0.4132 | 32.49 | 5750 | 0.5491 |
+ | 0.3975 | 33.9 | 6000 | 0.5465 |
+ | 0.3975 | 35.31 | 6250 | 0.5505 |
+ | 0.396 | 36.72 | 6500 | 0.5450 |
+ | 0.396 | 38.14 | 6750 | 0.5510 |
+ | 0.3884 | 39.55 | 7000 | 0.5517 |
+ | 0.3884 | 40.96 | 7250 | 0.5685 |
+ | 0.383 | 42.37 | 7500 | 0.5622 |
+ | 0.383 | 43.79 | 7750 | 0.5659 |
+ | 0.3806 | 45.2 | 8000 | 0.5636 |
+ | 0.3806 | 46.61 | 8250 | 0.5681 |
+ | 0.3738 | 48.02 | 8500 | 0.5797 |
+ | 0.3738 | 49.44 | 8750 | 0.5741 |
+ | 0.3705 | 50.85 | 9000 | 0.5765 |
+ | 0.3705 | 52.26 | 9250 | 0.5770 |
+ | 0.364 | 53.67 | 9500 | 0.5854 |
+ | 0.364 | 55.08 | 9750 | 0.5806 |
+ | 0.36 | 56.5 | 10000 | 0.5854 |

  ### Framework versions
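
The updated card does not yet include a usage example, so here is a minimal inference sketch for the new checkpoint. The Hub repo id `JBZhang2342/speecht5_tts` is inferred from the commit author and model name, and the CMU ARCTIC x-vector speaker embedding is an illustrative assumption, not something recorded in this commit.

```python
# Minimal sketch (not part of this commit): run the fine-tuned checkpoint with transformers.
# Assumptions: the Hub repo id "JBZhang2342/speecht5_tts" and a CMU ARCTIC x-vector
# as the speaker embedding; substitute your own values as needed.
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5ForTextToSpeech, SpeechT5HifiGan, SpeechT5Processor

processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts")
model = SpeechT5ForTextToSpeech.from_pretrained("JBZhang2342/speecht5_tts")  # assumed repo id
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# SpeechT5 conditions on a speaker embedding; the CMU ARCTIC x-vectors are a common choice.
xvectors = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embedding = torch.tensor(xvectors[7306]["xvector"]).unsqueeze(0)

inputs = processor(text="Hello from the fine-tuned SpeechT5 model.", return_tensors="pt")
speech = model.generate_speech(inputs["input_ids"], speaker_embedding, vocoder=vocoder)
sf.write("output.wav", speech.numpy(), samplerate=16000)  # SpeechT5 outputs 16 kHz audio
```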
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:5d49d5b1c7ee4c1bbdd55f3c1c6b6150c3c9079450e26dc347ef8bcec6bef55a
  size 577789320

  version https://git-lfs.github.com/spec/v1
+ oid sha256:df7d8f67102c864965a399f02f105b4c7d672e085b4e866f6b086f6c418bb2de
  size 577789320
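
The `model.safetensors` entry above is a Git LFS pointer, so the `oid sha256:...` value is the SHA-256 of the actual weight file. A small sketch for checking a downloaded copy against the new pointer (the local file name `model.safetensors` is assumed):

```python
# Sketch: verify a downloaded LFS file against the pointer recorded in this diff.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large weight files do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

print(sha256_of("model.safetensors"))
# Expected for the new revision: df7d8f67102c864965a399f02f105b4c7d672e085b4e866f6b086f6c418bb2de
```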
runs/Dec14_14-11-25_Threadripper/events.out.tfevents.1702581085.Threadripper ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f08e7b78fe4aa216256d552dd88eaa34bb9d19a7d22f575234b5c6d13927eef8
+ size 5829
runs/Dec14_14-11-56_Threadripper/events.out.tfevents.1702581116.Threadripper ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8a36a4b40196ff5a46deb80cc619077878ac5065a1d6fe6ecf8288767845e606
+ size 20163
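
The two `events.out.tfevents.*` files are TensorBoard logs written during training. A sketch for reading the logged scalars without the TensorBoard UI is below; the tag name `eval/loss` is the usual Hugging Face Trainer default and is assumed rather than confirmed from these files.

```python
# Sketch (with assumptions): read scalars from one of the event files added in this commit.
# Requires the `tensorboard` package; tag names like "eval/loss" are typical Trainer defaults.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Dec14_14-11-56_Threadripper")
acc.Reload()  # parse the events.out.tfevents.* file in that directory

print(acc.Tags()["scalars"])  # list the scalar tags that were actually logged

for event in acc.Scalars("eval/loss"):  # assumed tag name
    print(f"step={event.step} eval_loss={event.value:.4f}")
```

Running `tensorboard --logdir runs` serves the same curves interactively.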
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:00f1ed33672ce16ebfba284e24cdb224500498c8ae80d5a547566236dd268046
  size 4792

  version https://git-lfs.github.com/spec/v1
+ oid sha256:de1faca64e261be20c12e3cec812069c823b67b5330d7608ab32c7e69ec837eb
  size 4792
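
`training_args.bin` is the pickled training-arguments object the Trainer saves next to the checkpoint, so it should hold the hyperparameters summarized in the card (500 warmup steps, 10000 training steps, linear schedule, native AMP). A hedged sketch for inspecting it:

```python
# Sketch: inspect the saved training arguments. training_args.bin is a pickled
# TrainingArguments object, so it needs full (unsafe) unpickling; only load files you trust.
# On torch >= 2.6 pass weights_only=False; on older versions drop that argument.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__)  # likely Seq2SeqTrainingArguments for a SpeechT5 TTS run (assumption)
print(args.learning_rate, args.warmup_steps, args.max_steps, args.fp16)
```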