
Install the required libraries:

```bash
pip install -q -U transformers datasets accelerate peft trl bitsandbytes
```

This model was trained on the "garage-bAInd/Open-Platypus" dataset.

1,000 samples from that dataset were used to fine-tune LLaMA-2-7b.
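The card does not say how the 1,000 samples were selected; a minimal sketch with 🤗 Datasets, assuming a simple shuffle-and-take approach (the seed and sampling strategy are illustrative assumptions):

```python
from datasets import load_dataset

# Load Open-Platypus and keep 1,000 examples.
# NOTE: shuffling with seed=42 is an assumption; the actual selection method is not documented.
dataset = load_dataset("garage-bAInd/Open-Platypus", split="train")
dataset = dataset.shuffle(seed=42).select(range(1000))
```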

Prompt template used to prepare the dataset:

```python
def chat_template(example):
    # Wrap each instruction in the Alpaca-style prompt format used for fine-tuning.
    example["instruction"] = f"### Instruction:\n{example['instruction']}\n\n### Response:\n"
    return example

dataset = dataset.map(chat_template)
```
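The training configuration itself is not listed, but the installed libraries (peft, trl, bitsandbytes) suggest a QLoRA-style supervised fine-tune. Below is an illustrative sketch, not the actual recipe: every hyperparameter is an assumption, and the argument names follow older trl releases where `SFTTrainer` accepts `dataset_text_field` and `tokenizer` directly (newer versions use `SFTConfig`).

```python
# Illustrative QLoRA fine-tuning sketch; all values below are assumptions,
# not the settings actually used for this model.
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig
from trl import SFTTrainer

base_model = "meta-llama/Llama-2-7b-hf"

# Load the base model in 4-bit so a 7B model fits on a single GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(
    base_model, quantization_config=bnb_config, device_map="auto"
)

# LoRA adapters on the attention projections (illustrative choice).
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,             # the 1,000 formatted samples from above
    peft_config=peft_config,
    dataset_text_field="instruction",  # column holding the formatted prompt text
    max_seq_length=512,
    tokenizer=tokenizer,
    args=TrainingArguments(
        output_dir="llama2-7b-open-platypus",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        fp16=True,
        logging_steps=10,
    ),
)
trainer.train()
```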

Model size: 6.74B params · Tensor type: FP16 · Weights format: Safetensors
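A hedged example of loading the FP16 checkpoint for inference with the same prompt template; the repo id below is a placeholder, not this model's actual Hub id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: replace with this model's actual Hugging Face Hub id.
repo_id = "your-username/llama2-7b-open-platypus"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.float16, device_map="auto"
)

# Use the same prompt format the model was fine-tuned on.
prompt = "### Instruction:\nWhat is the capital of France?\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```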