arielnlee committed on
Commit
c27aff7
1 Parent(s): f784afa

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -8,7 +8,7 @@ datasets:
 
 # Platypus2-7B
 
- **NOTE**: There is some issue with LLaMa-2 7B and fine-tuning only works if you use `fp16=False` and `bf16=True` in the HF trainer. Working on a fix for this but if you have any thoughts about this issue or performance, please let us know!
+ **NOTE**: There is some issue with LLaMa-2 7B and fine-tuning only works if you use `fp16=False` and `bf16=True` in the HF trainer. Gathering more intel on this but if you have any thoughts about this issue or performance, please let us know!
 
 Platypus-7B is an instruction fine-tuned model based on the LLaMA2-7B transformer architecture.
 
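
For context, the setting the note refers to corresponds to the `fp16`/`bf16` flags on the Hugging Face `TrainingArguments`. A minimal sketch is below; the output directory, batch size, and epoch count are illustrative assumptions and not part of this commit.

```python
# Minimal sketch of the trainer precision settings the NOTE describes.
# Paths and hyperparameters are illustrative assumptions only.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="platypus2-7b-output",   # hypothetical output directory
    per_device_train_batch_size=4,      # illustrative value
    num_train_epochs=1,                 # illustrative value
    fp16=False,  # fp16 reportedly fails when fine-tuning LLaMA-2 7B
    bf16=True,   # bf16 works per the note (requires bf16-capable hardware, e.g. Ampere GPUs)
)
```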