chtan committed on
Commit 5bf3d06 (1 parent: 900082a)

Fix the micro_batch_size typo.

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -18,7 +18,7 @@ torchrun --nproc_per_node=8 finetune.py \
  --data_path './alpaca_data_gpt4.json' \
  --output_dir './gpt4-alpaca-lora_mlp-30b' \
  --batch_size 128 \
- --micro_batch_size 2 \
+ --micro_batch_size 8 \
  --num_epochs 10 \
  --learning_rate 1e-4 \
  --cutoff_len 512 \
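
For context on why this value matters: in alpaca-lora-style finetune.py scripts, `--batch_size` is the effective (global) batch size and `--micro_batch_size` is the number of samples per forward/backward pass, with the gap covered by gradient accumulation. The sketch below shows that arithmetic; the variable names and the per-rank division under DDP are assumptions about how such a script typically works, not taken from this diff.

```python
# Hypothetical sketch (not necessarily this repo's actual finetune.py) of how
# alpaca-lora-style scripts commonly turn --batch_size / --micro_batch_size
# into a gradient_accumulation_steps value for the trainer.

batch_size = 128        # --batch_size: effective (global) batch size
micro_batch_size = 8    # --micro_batch_size: samples per forward/backward pass
world_size = 8          # --nproc_per_node=8: GPUs launched by torchrun

# Accumulate micro-batches until one effective batch has been processed.
gradient_accumulation_steps = batch_size // micro_batch_size   # 128 // 8 = 16

# Under DDP every rank contributes micro_batch_size samples per step, so the
# accumulation is assumed to be split across ranks.
gradient_accumulation_steps //= world_size                     # 16 // 8 = 2

print(gradient_accumulation_steps)  # -> 2 accumulation steps per rank
```

With the corrected value, 128 / 8 = 16 micro-batches make up one effective batch, i.e. about 2 accumulation steps per rank across the 8 GPUs launched by torchrun, instead of the 64 implied by the old value of 2.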