mikhail-panzo committed
Commit
a08dd46
1 Parent(s): 060d55f

End of training

Files changed (1): README.md +11 -6
README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.0584
+- Loss: 0.7314
 
 ## Model description
 
@@ -36,19 +36,24 @@ More information needed
 The following hyperparameters were used during training:
 - learning_rate: 1e-05
 - train_batch_size: 4
-- eval_batch_size: 3
+- eval_batch_size: 2
 - seed: 42
+- gradient_accumulation_steps: 8
+- total_train_batch_size: 32
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 3.0
+- lr_scheduler_warmup_steps: 500
+- training_steps: 4000
+- mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| 1.8403        | 0.83  | 500  | 1.2271          |
-| 1.2393        | 1.65  | 1000 | 1.1105          |
-| 1.1409        | 2.48  | 1500 | 1.0584          |
+| 0.9147        | 13.22 | 1000 | 0.8691          |
+| 0.812         | 26.45 | 2000 | 0.7710          |
+| 0.7837        | 39.67 | 3000 | 0.7394          |
+| 0.7648        | 52.89 | 4000 | 0.7314          |
 
 
 ### Framework versions
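
The updated hyperparameters are internally consistent: the effective batch size is `train_batch_size × gradient_accumulation_steps = 32`, matching the reported `total_train_batch_size`, and the step/epoch columns of the new results table imply roughly 75–76 optimizer steps per epoch. A quick sanity check of that arithmetic (the dataset-size figure is an inference from the table, not stated in the README):

```python
# Sanity-check arithmetic for the hyperparameters on the "+" side of the diff.
train_batch_size = 4
gradient_accumulation_steps = 8

# Effective examples consumed per optimizer step:
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32, matching total_train_batch_size in the README

# The results table reports epoch 52.89 at step 4000, so:
training_steps = 4000
final_epoch = 52.89
steps_per_epoch = training_steps / final_epoch
print(round(steps_per_epoch, 1))  # ~75.6 optimizer steps per epoch

# Implied training-set size (an estimate, not stated in the README):
approx_dataset_size = steps_per_epoch * total_train_batch_size
print(round(approx_dataset_size))  # ~2420 examples
```

The same ratio (step ÷ epoch ≈ 75.6) holds for every row of the new table, which is a useful consistency check when editing a model card by hand.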