Nadilazev committed on
Commit 3bed670
1 Parent(s): bdaa04f

finetuning-zephyr-pt

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -33,7 +33,7 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 0.0002
- - train_batch_size: 8
+ - train_batch_size: 16
  - eval_batch_size: 8
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -48,6 +48,6 @@ The following hyperparameters were used during training:
  ### Framework versions
 
  - Transformers 4.35.0
- - Pytorch 2.1.0+cu118
+ - Pytorch 2.1.0+cu121
  - Datasets 2.14.6
  - Tokenizers 0.14.1
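
For context, here is a minimal sketch of how the updated hyperparameters could be expressed as `transformers.TrainingArguments` under Transformers 4.35.0. The `output_dir` name and the script structure are assumptions for illustration and are not part of the commit; only the values shown in the diff are taken from it.

```python
# Minimal sketch (assumed, not from the commit): the updated hyperparameters
# mapped onto transformers.TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finetuning-zephyr-pt",  # assumed output directory, named after the commit message
    learning_rate=2e-4,                 # learning_rate: 0.0002
    per_device_train_batch_size=16,     # train_batch_size: 8 -> 16 in this commit
    per_device_eval_batch_size=8,       # eval_batch_size: 8
    seed=42,                            # seed: 42
    adam_beta1=0.9,                     # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                  # and epsilon=1e-08
)
```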