JBZhang2342 committed on
Commit fd36feb
1 Parent(s): 69bff62

Model save

README.md CHANGED
@@ -1,26 +1,21 @@
 ---
-language:
-- en
 license: mit
 base_model: microsoft/speecht5_tts
 tags:
-- en_accent,mozilla,t5,common_voice_1_0
 - generated_from_trainer
-datasets:
-- mozilla-foundation/common_voice_1_0
 model-index:
-- name: SpeechT5 TTS English Accented
+- name: speecht5_tts
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# SpeechT5 TTS English Accented
+# speecht5_tts
 
-This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the Common Voice dataset.
+This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.3797
+- Loss: 0.3812
 
 ## Model description
 
@@ -46,53 +41,73 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 500
-- training_steps: 10000
+- training_steps: 15000
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:-----:|:---------------:|
-| No log | 2.34 | 250 | 0.5177 |
-| 0.587 | 4.67 | 500 | 0.3719 |
-| 0.587 | 7.01 | 750 | 0.3577 |
-| 0.4026 | 9.35 | 1000 | 0.3580 |
-| 0.4026 | 11.68 | 1250 | 0.3634 |
-| 0.3733 | 14.02 | 1500 | 0.3597 |
-| 0.3733 | 16.36 | 1750 | 0.3575 |
-| 0.3643 | 18.69 | 2000 | 0.3580 |
-| 0.3643 | 21.03 | 2250 | 0.3562 |
-| 0.35 | 23.36 | 2500 | 0.3566 |
-| 0.35 | 25.7 | 2750 | 0.3630 |
-| 0.3433 | 28.04 | 3000 | 0.3551 |
-| 0.3433 | 30.37 | 3250 | 0.3601 |
-| 0.3412 | 32.71 | 3500 | 0.3648 |
-| 0.3412 | 35.05 | 3750 | 0.3667 |
-| 0.3377 | 37.38 | 4000 | 0.3716 |
-| 0.3377 | 39.72 | 4250 | 0.3759 |
-| 0.333 | 42.06 | 4500 | 0.3709 |
-| 0.333 | 44.39 | 4750 | 0.3707 |
-| 0.3319 | 46.73 | 5000 | 0.3722 |
-| 0.3319 | 49.07 | 5250 | 0.3678 |
-| 0.328 | 51.4 | 5500 | 0.3653 |
-| 0.328 | 53.74 | 5750 | 0.3831 |
-| 0.3235 | 56.07 | 6000 | 0.3731 |
-| 0.3235 | 58.41 | 6250 | 0.3809 |
-| 0.3241 | 60.75 | 6500 | 0.3791 |
-| 0.3241 | 63.08 | 6750 | 0.3799 |
-| 0.3197 | 65.42 | 7000 | 0.3759 |
-| 0.3197 | 67.76 | 7250 | 0.3734 |
-| 0.3202 | 70.09 | 7500 | 0.3717 |
-| 0.3202 | 72.43 | 7750 | 0.3811 |
-| 0.3189 | 74.77 | 8000 | 0.3821 |
-| 0.3189 | 77.1 | 8250 | 0.3884 |
-| 0.3177 | 79.44 | 8500 | 0.3811 |
-| 0.3177 | 81.78 | 8750 | 0.3729 |
-| 0.3162 | 84.11 | 9000 | 0.3803 |
-| 0.3162 | 86.45 | 9250 | 0.3835 |
-| 0.3154 | 88.79 | 9500 | 0.3845 |
-| 0.3154 | 91.12 | 9750 | 0.3815 |
-| 0.3143 | 93.46 | 10000 | 0.3797 |
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:------:|:-----:|:---------------:|
+| No log | 2.34 | 250 | 0.5148 |
+| 0.5893 | 4.67 | 500 | 0.3786 |
+| 0.5893 | 7.01 | 750 | 0.3621 |
+| 0.4015 | 9.35 | 1000 | 0.3600 |
+| 0.4015 | 11.68 | 1250 | 0.3701 |
+| 0.3739 | 14.02 | 1500 | 0.3612 |
+| 0.3739 | 16.36 | 1750 | 0.3626 |
+| 0.3634 | 18.69 | 2000 | 0.3499 |
+| 0.3634 | 21.03 | 2250 | 0.3549 |
+| 0.3499 | 23.36 | 2500 | 0.3600 |
+| 0.3499 | 25.7 | 2750 | 0.3533 |
+| 0.3428 | 28.04 | 3000 | 0.3652 |
+| 0.3428 | 30.37 | 3250 | 0.3541 |
+| 0.3407 | 32.71 | 3500 | 0.3579 |
+| 0.3407 | 35.05 | 3750 | 0.3550 |
+| 0.3368 | 37.38 | 4000 | 0.3624 |
+| 0.3368 | 39.72 | 4250 | 0.3621 |
+| 0.3315 | 42.06 | 4500 | 0.3577 |
+| 0.3315 | 44.39 | 4750 | 0.3620 |
+| 0.3305 | 46.73 | 5000 | 0.3665 |
+| 0.3305 | 49.07 | 5250 | 0.3641 |
+| 0.3273 | 51.4 | 5500 | 0.3563 |
+| 0.3273 | 53.74 | 5750 | 0.3579 |
+| 0.3228 | 56.07 | 6000 | 0.3615 |
+| 0.3228 | 58.41 | 6250 | 0.3606 |
+| 0.3227 | 60.75 | 6500 | 0.3647 |
+| 0.3227 | 63.08 | 6750 | 0.3647 |
+| 0.3183 | 65.42 | 7000 | 0.3619 |
+| 0.3183 | 67.76 | 7250 | 0.3786 |
+| 0.3184 | 70.09 | 7500 | 0.3731 |
+| 0.3184 | 72.43 | 7750 | 0.3630 |
+| 0.3177 | 74.77 | 8000 | 0.3647 |
+| 0.3177 | 77.1 | 8250 | 0.3668 |
+| 0.3159 | 79.44 | 8500 | 0.3624 |
+| 0.3159 | 81.78 | 8750 | 0.3742 |
+| 0.3129 | 84.11 | 9000 | 0.3722 |
+| 0.3129 | 86.45 | 9250 | 0.3755 |
+| 0.3124 | 88.79 | 9500 | 0.3693 |
+| 0.3124 | 91.12 | 9750 | 0.3707 |
+| 0.3094 | 93.46 | 10000 | 0.3808 |
+| 0.3094 | 95.79 | 10250 | 0.3696 |
+| 0.3116 | 98.13 | 10500 | 0.3773 |
+| 0.3116 | 100.47 | 10750 | 0.3796 |
+| 0.3076 | 102.8 | 11000 | 0.3705 |
+| 0.3076 | 105.14 | 11250 | 0.3718 |
+| 0.3104 | 107.48 | 11500 | 0.3792 |
+| 0.3104 | 109.81 | 11750 | 0.3714 |
+| 0.3078 | 112.15 | 12000 | 0.3765 |
+| 0.3078 | 114.49 | 12250 | 0.3803 |
+| 0.3064 | 116.82 | 12500 | 0.3792 |
+| 0.3064 | 119.16 | 12750 | 0.3803 |
+| 0.3087 | 121.5 | 13000 | 0.3806 |
+| 0.3087 | 123.83 | 13250 | 0.3821 |
+| 0.3064 | 126.17 | 13500 | 0.3795 |
+| 0.3064 | 128.5 | 13750 | 0.3766 |
+| 0.3066 | 130.84 | 14000 | 0.3780 |
+| 0.3066 | 133.18 | 14250 | 0.3858 |
+| 0.3081 | 135.51 | 14500 | 0.3812 |
+| 0.3081 | 137.85 | 14750 | 0.3829 |
+| 0.3064 | 140.19 | 15000 | 0.3812 |
 
 
 ### Framework versions
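
The hyperparameters listed in the hunk above map onto a `Seq2SeqTrainingArguments` configuration roughly as follows. This is a sketch under assumptions, not the training script behind this commit: only the values visible in the diff are filled in, and the output directory, learning rate, and batch sizes are placeholders.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the training configuration implied by the model card diff above.
# Values not shown in the diff (learning rate, batch sizes, output_dir) are
# placeholders, not the ones actually used in this commit.
training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_tts",   # placeholder output directory
    max_steps=15000,             # training_steps: 15000 (was 10000)
    warmup_steps=500,            # lr_scheduler_warmup_steps: 500
    lr_scheduler_type="linear",  # lr_scheduler_type: linear
    adam_beta1=0.9,              # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,          # and epsilon=1e-08
    fp16=True,                   # mixed_precision_training: Native AMP
)
```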
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1dd52aa8323f242b94668a8a577835caa25e9bc9a798a1ab4c97153a579cdbc7
+oid sha256:9a05b02eb367ed368eb20c866fb94be2cbf2641c9ac48a3d8d8954cebddb090a
 size 577789320
runs/Dec09_21-48-39_Threadripper/events.out.tfevents.1702176519.Threadripper ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3b8de289d48e356d2076fc6148eaebef37e1e0c94f483e8f600f874f8079d431
+size 27152
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d3f6b2def9ba577d3825f85e8b5eecada59a187006f1a7157caed8116b45a86c
+oid sha256:bb4575c6e840126acd570f7c9adc2f23151f449b29dcdf9fc86ddf2ebd9fda2c
 size 4792
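
A minimal inference sketch for the checkpoint saved in this commit, using the standard SpeechT5 classes from `transformers`. The repository id `JBZhang2342/speecht5_tts` and the zero speaker embedding are assumptions for illustration; in practice you would load the fine-tuned weights from wherever this commit lives and supply a real 512-dimensional x-vector for the target voice.

```python
import torch
import soundfile as sf
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

# Processor and vocoder come from the base model and its companion vocoder;
# the fine-tuned weights are assumed to live at "JBZhang2342/speecht5_tts".
processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts")
model = SpeechT5ForTextToSpeech.from_pretrained("JBZhang2342/speecht5_tts")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello from the fine-tuned SpeechT5 model.", return_tensors="pt")

# SpeechT5 conditions generation on a 512-dim speaker (x-vector) embedding.
# A zero vector is only a placeholder; use an embedding of the target speaker.
speaker_embeddings = torch.zeros((1, 512))

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)  # SpeechT5 outputs 16 kHz audio
```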