---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---
Install the required libraries: `pip install -q -U transformers datasets accelerate peft trl bitsandbytes`
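The stack above (peft, trl, bitsandbytes) is the usual QLoRA setup. The exact training configuration is not documented on this card, so the following is only a minimal sketch of how the base model might be loaded in 4-bit with a LoRA adapter attached; the base model ID, LoRA hyperparameters, and target modules are illustrative assumptions, not the values used for this checkpoint.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_model_id = "meta-llama/Llama-2-7b-hf"  # assumed base checkpoint

# Load the base model in 4-bit (NF4) so it fits on a single GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
tokenizer.pad_token = tokenizer.eos_token

# Attach a LoRA adapter; r / alpha / target_modules are illustrative defaults.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# The supervised fine-tuning step itself would typically go through trl's SFTTrainer.
```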
The model was fine-tuned on the "garage-bAInd/Open-Platypus" dataset; I used 1,000 samples to fine-tune LLaMA-2-7B.
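The card doesn't state how the 1,000 samples were selected; a minimal sketch of pulling a 1,000-example subset with the datasets library could look like this (the shuffle and seed are assumptions):

```python
from datasets import load_dataset

# Load the Open-Platypus training split and keep a 1,000-example subset.
dataset = load_dataset("garage-bAInd/Open-Platypus", split="train")
dataset = dataset.shuffle(seed=42).select(range(1000))  # shuffling/seed are assumptions
```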
The prompt template I used to prepare the dataset:
    # Wrap each sample's instruction in the "### Instruction / ### Response" prompt format.
    def chat_template(example):
        example["instruction"] = f"### Instruction:\n{example['instruction']}\n\n### Response:\n"
        return example

    dataset = dataset.map(chat_template)
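At inference time the model expects the same prompt format. A minimal generation sketch with transformers (the repo ID below is a placeholder; substitute this model's actual Hub ID):

```python
from transformers import pipeline

# Placeholder repo ID; replace with this model's actual Hub ID.
generator = pipeline(
    "text-generation",
    model="your-username/your-finetuned-llama-2-7b",
    device_map="auto",
)

# Use the same "### Instruction / ### Response" template the model was trained on.
prompt = "### Instruction:\nExplain the difference between a list and a tuple in Python.\n\n### Response:\n"
output = generator(prompt, max_new_tokens=256, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])
```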