KGSAGAR committed
Commit 171d1ba · 1 Parent(s): cc9c1e3

Update README.md

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
@@ -3,6 +3,7 @@ license: mit
  base_model: KGSAGAR/speecht5_finetuned_voxpopuli_es
  tags:
  - generated_from_trainer
+ - text-to-speech model
  datasets:
  - voxpopuli
  model-index:
@@ -25,11 +26,11 @@ More information needed
 
  ## Intended uses & limitations
 
- More information needed
+ [Colab notebook](https://colab.research.google.com/drive/1NG4uTeW97wYRdzkjfWI4gONntxg9Y17S?usp=sharing): increase the training steps parameter and train the model further to improve performance.
 
  ## Training and evaluation data
 
- More information needed
+ TrainOutput(global_step=25, training_loss=0.6927243232727051, metrics={'train_runtime': 8513.8366, 'train_samples_per_second': 0.094, 'train_steps_per_second': 0.003, 'total_flos': 116396101622592.0, 'train_loss': 0.6927243232727051, 'epoch': 0.11})
 
  ## Training procedure
 
@@ -63,4 +64,4 @@ The following hyperparameters were used during training:
  - Transformers 4.32.0
  - Pytorch 2.0.1+cu118
  - Datasets 2.14.4
- - Tokenizers 0.13.3
+ - Tokenizers 0.13.3
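
The note added under "Intended uses & limitations" suggests raising the training-steps parameter and continuing fine-tuning from this checkpoint (the TrainOutput above stops at global_step=25). A minimal sketch of that knob with `Seq2SeqTrainingArguments` from Transformers 4.32.0 follows; the surrounding setup (prepared VoxPopuli data, data collator, `Seq2SeqTrainer`) is assumed to come from the linked Colab, and every value below is an illustrative assumption, not the original hyperparameters.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative settings only (assumptions, not the Colab's values).
# max_steps is the parameter the note above suggests raising; the earlier
# run reported in TrainOutput stopped at global_step=25.
training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_finetuned_voxpopuli_es",
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,
    learning_rate=1e-5,
    warmup_steps=100,
    max_steps=1500,        # raise this for a longer fine-tuning run
    save_steps=500,
    logging_steps=25,
    report_to=["tensorboard"],
)
```

Passing these arguments to the same `Seq2SeqTrainer` used for the original 25-step run would continue training toward the larger step budget.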
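Since the commit also tags the model for text-to-speech, a minimal inference sketch with the standard SpeechT5 classes from Transformers may be useful. The vocoder checkpoint, the speaker x-vector dataset, and the Spanish sample sentence are assumptions; if this repository does not ship processor files, the processor can be loaded from `microsoft/speecht5_tts` instead.

```python
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5ForTextToSpeech, SpeechT5HifiGan, SpeechT5Processor

# Load the fine-tuned checkpoint and the standard SpeechT5 HiFi-GAN vocoder.
processor = SpeechT5Processor.from_pretrained("KGSAGAR/speecht5_finetuned_voxpopuli_es")
model = SpeechT5ForTextToSpeech.from_pretrained("KGSAGAR/speecht5_finetuned_voxpopuli_es")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# SpeechT5 needs a 512-dim speaker embedding; the CMU ARCTIC x-vectors are a
# common off-the-shelf choice (assumption -- the training Colab may use another speaker).
xvectors = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(xvectors[7306]["xvector"]).unsqueeze(0)

# Synthesize a short Spanish sentence and write 16 kHz audio to disk.
inputs = processor(text="hola, esto es una prueba de síntesis de voz", return_tensors="pt")
speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech_es.wav", speech.numpy(), samplerate=16000)
```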