pravin96 committed
Commit 002208c
1 Parent(s): ffe7129

Model save

Files changed (1):
  1. README.md +6 -8
README.md CHANGED

@@ -4,8 +4,6 @@ license: mit
 base_model: distil-whisper/distil-small.en
 tags:
 - generated_from_trainer
-datasets:
-- generator
 model-index:
 - name: distil_whisper_en
   results: []
@@ -16,7 +14,7 @@ should probably proofread and complete it, then remove this comment. -->

 # distil_whisper_en

-This model is a fine-tuned version of [distil-whisper/distil-small.en](https://huggingface.co/distil-whisper/distil-small.en) on the generator dataset.
+This model is a fine-tuned version of [distil-whisper/distil-small.en](https://huggingface.co/distil-whisper/distil-small.en) on the None dataset.

 ## Model description

@@ -36,15 +34,15 @@ More information needed

 The following hyperparameters were used during training:
 - learning_rate: 0.0001
-- train_batch_size: 2
+- train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
-- gradient_accumulation_steps: 2
-- total_train_batch_size: 4
+- gradient_accumulation_steps: 4
+- total_train_batch_size: 32
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- lr_scheduler_warmup_steps: 100
-- training_steps: 100
+- lr_scheduler_warmup_steps: 800
+- training_steps: 350
 - mixed_precision_training: Native AMP

 ### Training results
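
For reference, the updated hyperparameters would correspond to a training configuration roughly like the sketch below, expressed with transformers' `Seq2SeqTrainingArguments`. This is an illustration only, not the author's actual script: the commit changes the auto-generated model card rather than the training code, and the `output_dir` value is a hypothetical name. Note that the total train batch size of 32 follows from train_batch_size 8 × gradient_accumulation_steps 4.

```python
# Minimal sketch mapping the model-card hyperparameters onto
# transformers' Seq2SeqTrainingArguments (assumed setup, not the
# training script used for this commit).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="distil_whisper_en",    # hypothetical; matches the model name in the card
    learning_rate=1e-4,                # learning_rate: 0.0001
    per_device_train_batch_size=8,     # train_batch_size: 8
    per_device_eval_batch_size=8,      # eval_batch_size: 8
    gradient_accumulation_steps=4,     # total_train_batch_size: 8 * 4 = 32
    seed=42,                           # seed: 42
    lr_scheduler_type="linear",        # lr_scheduler_type: linear
    warmup_steps=800,                  # lr_scheduler_warmup_steps: 800
    max_steps=350,                     # training_steps: 350
    fp16=True,                         # mixed_precision_training: Native AMP
)
# The default AdamW optimizer already uses betas=(0.9, 0.999) and
# epsilon=1e-08, matching the optimizer line in the card.
```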