JBZhang2342 committed on
Commit fe1b2c9
1 Parent(s): 16c0116

Model save

README.md CHANGED
@@ -1,26 +1,21 @@
  ---
- language:
- - en
  license: mit
  base_model: microsoft/speecht5_tts
  tags:
- - en_accent,mozilla,t5,common_voice_1_0
  - generated_from_trainer
- datasets:
- - mozilla-foundation/common_voice_1_0
  model-index:
- - name: SpeechT5 TTS English Accented
  results: []
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
- # SpeechT5 TTS English Accented
 
- This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the Common Voice dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.7806
 
  ## Model description
 
@@ -51,128 +46,128 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:-----:|:-----:|:---------------:|
- | No log | 0.53 | 250 | 0.8506 |
- | 1.0736 | 1.06 | 500 | 0.8219 |
- | 1.0736 | 1.6 | 750 | 0.7713 |
- | 0.8607 | 2.13 | 1000 | 0.7947 |
- | 0.8607 | 2.66 | 1250 | 0.7537 |
- | 0.802 | 3.19 | 1500 | 0.7304 |
- | 0.802 | 3.72 | 1750 | 0.7409 |
- | 0.7627 | 4.26 | 2000 | 0.7282 |
- | 0.7627 | 4.79 | 2250 | 0.7224 |
- | 0.7442 | 5.32 | 2500 | 0.7132 |
- | 0.7442 | 5.85 | 2750 | 0.7718 |
- | 0.736 | 6.38 | 3000 | 0.7362 |
- | 0.736 | 6.91 | 3250 | 0.7283 |
- | 0.7234 | 7.45 | 3500 | 0.7377 |
- | 0.7234 | 7.98 | 3750 | 0.7226 |
- | 0.6968 | 8.51 | 4000 | 0.7285 |
- | 0.6968 | 9.04 | 4250 | 0.7395 |
- | 0.692 | 9.57 | 4500 | 0.7306 |
- | 0.692 | 10.11 | 4750 | 0.7221 |
- | 0.6807 | 10.64 | 5000 | 0.7349 |
- | 0.6807 | 11.17 | 5250 | 0.7310 |
- | 0.6702 | 11.7 | 5500 | 0.7391 |
- | 0.6702 | 12.23 | 5750 | 0.7299 |
- | 0.6559 | 12.77 | 6000 | 0.7277 |
- | 0.6559 | 13.3 | 6250 | 0.7453 |
- | 0.6511 | 13.83 | 6500 | 0.7303 |
- | 0.6511 | 14.36 | 6750 | 0.7451 |
- | 0.6335 | 14.89 | 7000 | 0.7209 |
- | 0.6335 | 15.43 | 7250 | 0.7421 |
- | 0.6282 | 15.96 | 7500 | 0.7277 |
- | 0.6282 | 16.49 | 7750 | 0.7426 |
- | 0.6286 | 17.02 | 8000 | 0.7724 |
- | 0.6286 | 17.55 | 8250 | 0.7310 |
- | 0.6164 | 18.09 | 8500 | 0.7414 |
- | 0.6164 | 18.62 | 8750 | 0.7411 |
- | 0.6029 | 19.15 | 9000 | 0.7466 |
- | 0.6029 | 19.68 | 9250 | 0.7267 |
- | 0.5986 | 20.21 | 9500 | 0.7593 |
- | 0.5986 | 20.74 | 9750 | 0.7544 |
- | 0.595 | 21.28 | 10000 | 0.7441 |
- | 0.595 | 21.81 | 10250 | 0.7422 |
- | 0.5905 | 22.34 | 10500 | 0.7399 |
- | 0.5905 | 22.87 | 10750 | 0.7494 |
- | 0.5792 | 23.4 | 11000 | 0.7311 |
- | 0.5792 | 23.94 | 11250 | 0.7479 |
- | 0.5774 | 24.47 | 11500 | 0.7615 |
- | 0.5774 | 25.0 | 11750 | 0.7578 |
- | 0.5684 | 25.53 | 12000 | 0.7603 |
- | 0.5684 | 26.06 | 12250 | 0.7300 |
- | 0.5621 | 26.6 | 12500 | 0.7385 |
- | 0.5621 | 27.13 | 12750 | 0.7447 |
- | 0.5666 | 27.66 | 13000 | 0.7400 |
- | 0.5666 | 28.19 | 13250 | 0.7518 |
- | 0.5525 | 28.72 | 13500 | 0.7462 |
- | 0.5525 | 29.26 | 13750 | 0.7351 |
- | 0.5471 | 29.79 | 14000 | 0.7673 |
- | 0.5471 | 30.32 | 14250 | 0.7325 |
- | 0.5449 | 30.85 | 14500 | 0.7455 |
- | 0.5449 | 31.38 | 14750 | 0.7473 |
- | 0.5349 | 31.91 | 15000 | 0.7549 |
- | 0.5349 | 32.45 | 15250 | 0.7513 |
- | 0.5345 | 32.98 | 15500 | 0.7472 |
- | 0.5345 | 33.51 | 15750 | 0.7542 |
- | 0.5285 | 34.04 | 16000 | 0.7513 |
- | 0.5285 | 34.57 | 16250 | 0.7466 |
- | 0.522 | 35.11 | 16500 | 0.7627 |
- | 0.522 | 35.64 | 16750 | 0.7609 |
- | 0.5209 | 36.17 | 17000 | 0.7616 |
- | 0.5209 | 36.7 | 17250 | 0.7612 |
- | 0.5151 | 37.23 | 17500 | 0.7601 |
- | 0.5151 | 37.77 | 17750 | 0.7590 |
- | 0.5088 | 38.3 | 18000 | 0.7568 |
- | 0.5088 | 38.83 | 18250 | 0.7551 |
- | 0.5105 | 39.36 | 18500 | 0.7688 |
- | 0.5105 | 39.89 | 18750 | 0.7631 |
- | 0.5046 | 40.43 | 19000 | 0.7654 |
- | 0.5046 | 40.96 | 19250 | 0.7749 |
- | 0.5029 | 41.49 | 19500 | 0.7617 |
- | 0.5029 | 42.02 | 19750 | 0.7735 |
- | 0.4969 | 42.55 | 20000 | 0.7763 |
- | 0.4969 | 43.09 | 20250 | 0.7484 |
- | 0.497 | 43.62 | 20500 | 0.7606 |
- | 0.497 | 44.15 | 20750 | 0.7726 |
- | 0.4889 | 44.68 | 21000 | 0.7564 |
- | 0.4889 | 45.21 | 21250 | 0.7694 |
- | 0.4842 | 45.74 | 21500 | 0.7639 |
- | 0.4842 | 46.28 | 21750 | 0.7784 |
- | 0.4829 | 46.81 | 22000 | 0.7817 |
- | 0.4829 | 47.34 | 22250 | 0.7727 |
- | 0.4772 | 47.87 | 22500 | 0.7661 |
- | 0.4772 | 48.4 | 22750 | 0.7630 |
- | 0.477 | 48.94 | 23000 | 0.7640 |
- | 0.477 | 49.47 | 23250 | 0.7730 |
- | 0.4766 | 50.0 | 23500 | 0.7708 |
- | 0.4766 | 50.53 | 23750 | 0.7716 |
- | 0.4717 | 51.06 | 24000 | 0.7670 |
- | 0.4717 | 51.6 | 24250 | 0.7671 |
- | 0.4686 | 52.13 | 24500 | 0.7711 |
- | 0.4686 | 52.66 | 24750 | 0.7704 |
- | 0.4685 | 53.19 | 25000 | 0.7775 |
- | 0.4685 | 53.72 | 25250 | 0.7690 |
- | 0.4635 | 54.26 | 25500 | 0.7839 |
- | 0.4635 | 54.79 | 25750 | 0.7746 |
- | 0.4617 | 55.32 | 26000 | 0.7738 |
- | 0.4617 | 55.85 | 26250 | 0.7753 |
- | 0.4549 | 56.38 | 26500 | 0.7830 |
- | 0.4549 | 56.91 | 26750 | 0.7777 |
- | 0.4564 | 57.45 | 27000 | 0.7758 |
- | 0.4564 | 57.98 | 27250 | 0.7728 |
- | 0.4546 | 58.51 | 27500 | 0.7772 |
- | 0.4546 | 59.04 | 27750 | 0.7795 |
- | 0.4511 | 59.57 | 28000 | 0.7754 |
- | 0.4511 | 60.11 | 28250 | 0.7867 |
- | 0.4467 | 60.64 | 28500 | 0.7838 |
- | 0.4467 | 61.17 | 28750 | 0.7858 |
- | 0.4512 | 61.7 | 29000 | 0.7758 |
- | 0.4512 | 62.23 | 29250 | 0.7819 |
- | 0.4497 | 62.77 | 29500 | 0.7871 |
- | 0.4497 | 63.3 | 29750 | 0.7817 |
- | 0.4463 | 63.83 | 30000 | 0.7806 |
 
 
  ### Framework versions
 
  ---
  license: mit
  base_model: microsoft/speecht5_tts
  tags:
  - generated_from_trainer
  model-index:
+ - name: speecht5_tts
  results: []
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
+ # speecht5_tts
 
+ This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the None dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.4510
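As an aside on the numbers quoted in this diff: the removed card above reported an eval loss of 0.7806, while the new card reports 0.4510. A quick sketch of the relative change, using only values that appear in the two cards (note the two runs were trained on different data, so the losses are not strictly comparable):

```python
# Eval losses quoted by the two versions of the model card in this diff
old_loss = 0.7806  # removed card: "SpeechT5 TTS English Accented" (Common Voice)
new_loss = 0.4510  # new card: "speecht5_tts" (dataset not recorded)

# Relative reduction in the reported evaluation loss
improvement = (old_loss - new_loss) / old_loss
print(f"{improvement:.1%}")  # ~42.2%
```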
 
  ## Model description
 
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:-----:|:---------------:|
+ | No log | 2.34 | 250 | 0.4040 |
+ | 0.4941 | 4.67 | 500 | 0.3757 |
+ | 0.4941 | 7.01 | 750 | 0.3897 |
+ | 0.4047 | 9.35 | 1000 | 0.3722 |
+ | 0.4047 | 11.68 | 1250 | 0.3740 |
+ | 0.3766 | 14.02 | 1500 | 0.3743 |
+ | 0.3766 | 16.36 | 1750 | 0.3888 |
+ | 0.3689 | 18.69 | 2000 | 0.3749 |
+ | 0.3689 | 21.03 | 2250 | 0.3867 |
+ | 0.349 | 23.36 | 2500 | 0.3795 |
+ | 0.349 | 25.7 | 2750 | 0.3936 |
+ | 0.3424 | 28.04 | 3000 | 0.3786 |
+ | 0.3424 | 30.37 | 3250 | 0.3725 |
+ | 0.3394 | 32.71 | 3500 | 0.3869 |
+ | 0.3394 | 35.05 | 3750 | 0.3978 |
+ | 0.3333 | 37.38 | 4000 | 0.3944 |
+ | 0.3333 | 39.72 | 4250 | 0.4143 |
+ | 0.3293 | 42.06 | 4500 | 0.3949 |
+ | 0.3293 | 44.39 | 4750 | 0.3739 |
+ | 0.3275 | 46.73 | 5000 | 0.3886 |
+ | 0.3275 | 49.07 | 5250 | 0.3900 |
+ | 0.3219 | 51.4 | 5500 | 0.3955 |
+ | 0.3219 | 53.74 | 5750 | 0.4065 |
+ | 0.3147 | 56.07 | 6000 | 0.4022 |
+ | 0.3147 | 58.41 | 6250 | 0.3874 |
+ | 0.3153 | 60.75 | 6500 | 0.3999 |
+ | 0.3153 | 63.08 | 6750 | 0.3880 |
+ | 0.3084 | 65.42 | 7000 | 0.3874 |
+ | 0.3084 | 67.76 | 7250 | 0.4324 |
+ | 0.3067 | 70.09 | 7500 | 0.4177 |
+ | 0.3067 | 72.43 | 7750 | 0.4054 |
+ | 0.3044 | 74.77 | 8000 | 0.4039 |
+ | 0.3044 | 77.1 | 8250 | 0.4080 |
+ | 0.3001 | 79.44 | 8500 | 0.3963 |
+ | 0.3001 | 81.78 | 8750 | 0.4182 |
+ | 0.2965 | 84.11 | 9000 | 0.4026 |
+ | 0.2965 | 86.45 | 9250 | 0.4261 |
+ | 0.2939 | 88.79 | 9500 | 0.4102 |
+ | 0.2939 | 91.12 | 9750 | 0.4187 |
+ | 0.2903 | 93.46 | 10000 | 0.4052 |
+ | 0.2903 | 95.79 | 10250 | 0.4185 |
+ | 0.2904 | 98.13 | 10500 | 0.4092 |
+ | 0.2904 | 100.47 | 10750 | 0.4182 |
+ | 0.285 | 102.8 | 11000 | 0.4127 |
+ | 0.285 | 105.14 | 11250 | 0.4231 |
+ | 0.2859 | 107.48 | 11500 | 0.4053 |
+ | 0.2859 | 109.81 | 11750 | 0.4249 |
+ | 0.2824 | 112.15 | 12000 | 0.4086 |
+ | 0.2824 | 114.49 | 12250 | 0.4232 |
+ | 0.2794 | 116.82 | 12500 | 0.4210 |
+ | 0.2794 | 119.16 | 12750 | 0.4295 |
+ | 0.2803 | 121.5 | 13000 | 0.4412 |
+ | 0.2803 | 123.83 | 13250 | 0.4201 |
+ | 0.277 | 126.17 | 13500 | 0.4181 |
+ | 0.277 | 128.5 | 13750 | 0.4179 |
+ | 0.2744 | 130.84 | 14000 | 0.4257 |
+ | 0.2744 | 133.18 | 14250 | 0.4396 |
+ | 0.2744 | 135.51 | 14500 | 0.4265 |
+ | 0.2744 | 137.85 | 14750 | 0.4198 |
+ | 0.2702 | 140.19 | 15000 | 0.4414 |
+ | 0.2702 | 142.52 | 15250 | 0.4304 |
+ | 0.2661 | 144.86 | 15500 | 0.4444 |
+ | 0.2661 | 147.2 | 15750 | 0.4228 |
+ | 0.2649 | 149.53 | 16000 | 0.4412 |
+ | 0.2649 | 151.87 | 16250 | 0.4269 |
+ | 0.264 | 154.21 | 16500 | 0.4343 |
+ | 0.264 | 156.54 | 16750 | 0.4240 |
+ | 0.2616 | 158.88 | 17000 | 0.4350 |
+ | 0.2616 | 161.21 | 17250 | 0.4430 |
+ | 0.2611 | 163.55 | 17500 | 0.4545 |
+ | 0.2611 | 165.89 | 17750 | 0.4512 |
+ | 0.2601 | 168.22 | 18000 | 0.4569 |
+ | 0.2601 | 170.56 | 18250 | 0.4263 |
+ | 0.2596 | 172.9 | 18500 | 0.4336 |
+ | 0.2596 | 175.23 | 18750 | 0.4464 |
+ | 0.2564 | 177.57 | 19000 | 0.4546 |
+ | 0.2564 | 179.91 | 19250 | 0.4513 |
+ | 0.2554 | 182.24 | 19500 | 0.4349 |
+ | 0.2554 | 184.58 | 19750 | 0.4360 |
+ | 0.255 | 186.92 | 20000 | 0.4571 |
+ | 0.255 | 189.25 | 20250 | 0.4411 |
+ | 0.2525 | 191.59 | 20500 | 0.4435 |
+ | 0.2525 | 193.93 | 20750 | 0.4325 |
+ | 0.251 | 196.26 | 21000 | 0.4441 |
+ | 0.251 | 198.6 | 21250 | 0.4331 |
+ | 0.2505 | 200.93 | 21500 | 0.4484 |
+ | 0.2505 | 203.27 | 21750 | 0.4418 |
+ | 0.2517 | 205.61 | 22000 | 0.4473 |
+ | 0.2517 | 207.94 | 22250 | 0.4519 |
+ | 0.2477 | 210.28 | 22500 | 0.4428 |
+ | 0.2477 | 212.62 | 22750 | 0.4464 |
+ | 0.2468 | 214.95 | 23000 | 0.4387 |
+ | 0.2468 | 217.29 | 23250 | 0.4600 |
+ | 0.2467 | 219.63 | 23500 | 0.4404 |
+ | 0.2467 | 221.96 | 23750 | 0.4586 |
+ | 0.2469 | 224.3 | 24000 | 0.4449 |
+ | 0.2469 | 226.64 | 24250 | 0.4521 |
+ | 0.2445 | 228.97 | 24500 | 0.4480 |
+ | 0.2445 | 231.31 | 24750 | 0.4586 |
+ | 0.2438 | 233.64 | 25000 | 0.4529 |
+ | 0.2438 | 235.98 | 25250 | 0.4515 |
+ | 0.2413 | 238.32 | 25500 | 0.4570 |
+ | 0.2413 | 240.65 | 25750 | 0.4486 |
+ | 0.2449 | 242.99 | 26000 | 0.4490 |
+ | 0.2449 | 245.33 | 26250 | 0.4479 |
+ | 0.2412 | 247.66 | 26500 | 0.4509 |
+ | 0.2412 | 250.0 | 26750 | 0.4472 |
+ | 0.2417 | 252.34 | 27000 | 0.4444 |
+ | 0.2417 | 254.67 | 27250 | 0.4477 |
+ | 0.2407 | 257.01 | 27500 | 0.4494 |
+ | 0.2407 | 259.35 | 27750 | 0.4530 |
+ | 0.2397 | 261.68 | 28000 | 0.4474 |
+ | 0.2397 | 264.02 | 28250 | 0.4484 |
+ | 0.2397 | 266.36 | 28500 | 0.4512 |
+ | 0.2397 | 268.69 | 28750 | 0.4523 |
+ | 0.2397 | 271.03 | 29000 | 0.4451 |
+ | 0.2397 | 273.36 | 29250 | 0.4476 |
+ | 0.241 | 275.7 | 29500 | 0.4515 |
+ | 0.241 | 278.04 | 29750 | 0.4514 |
+ | 0.2395 | 280.37 | 30000 | 0.4510 |
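The Epoch column in the table above appears to be Step divided by roughly 107 optimizer steps per epoch; this is an inference from the logged values (e.g. step 250 at epoch 2.34, step 30000 at epoch 280.37), not something the card states. A small sketch that reproduces the logged values under that assumption:

```python
# Assumption: ~107 optimizer steps per epoch, inferred from the table above;
# the card itself does not state this value.
STEPS_PER_EPOCH = 107

def step_to_epoch(step: int) -> float:
    """Approximate the Epoch value the Trainer logs at a given optimizer step."""
    return round(step / STEPS_PER_EPOCH, 2)

# Spot-check against rows of the training-results table
for step, epoch in [(250, 2.34), (500, 4.67), (14500, 135.51), (30000, 280.37)]:
    assert step_to_epoch(step) == epoch
```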
 
 
  ### Framework versions
runs/Dec10_17-23-42_Threadripper/events.out.tfevents.1702247022.Threadripper CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6526935b2d420389c7856bf66fb824d657605ebab723b4bd0e9eb3070fe345cb
- size 33612
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:0f667cd38d7008ac6085a0b3d559068580a7224a7b73aa90d32db3fb736f1c7a
+ size 48488