
Summary

This model is a merge of the following four models:

  1. HWERI/Llama2-7b-sharegpt4
  2. circulus/Llama-2-7b-orca-v1
  3. starmpcc/Asclepius-Llama2-7B
  4. meta-llama/Llama-2-7b-chat-hf

The merge is performed by averaging the corresponding parameters of all four models element-wise:

merged_wt.copy_( (base_wt + asclepius_wt + sharegpt_wt + orca_wt) / 4.0 )
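The averaging step can be sketched as follows. This is a minimal illustration, not the exact merge script: the function name `average_state_dicts` is hypothetical, and plain floats stand in for the `torch` tensors you would get from each model's `state_dict()`.

```python
# Minimal sketch of the element-wise parameter average used for the merge.
# In practice each value would be a torch tensor from model.state_dict();
# plain floats keep the example self-contained.

def average_state_dicts(state_dicts):
    """Average parameters that share a name across all models (hypothetical helper)."""
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum(sd[name] for sd in state_dicts) / len(state_dicts)
    return merged

# One scalar "weight" per model, mirroring base/asclepius/sharegpt/orca above.
base = {"w": 1.0}
asclepius = {"w": 3.0}
sharegpt = {"w": 5.0}
orca = {"w": 7.0}

merged = average_state_dicts([base, asclepius, sharegpt, orca])
print(merged["w"])  # (1.0 + 3.0 + 5.0 + 7.0) / 4 = 4.0
```

With real checkpoints, the same loop would run over every tensor name shared by the four state dicts, which requires that all models use an identical architecture, as is the case for these Llama-2-7B variants.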

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

MODEL_ID = "prhegde/multi-merged-llama2-7b"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

prompt = "You are given a passage PASSAGE and a boolean question QUESTION. Return the answer as True or False. \n PASSAGE: Calcium carbide is a chemical compound with the chemical formula of CaC2. Its main use industrially is in the production of acetylene and calcium cyanamide. \n QUESTION: is calcium carbide cac2 the raw material for the production of acetylene? \n ANSWER: "

# Tokenize the prompt and generate a short answer (True/False).
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output = model.generate(input_ids, max_new_tokens=8)
output_seq = tokenizer.decode(output[0], skip_special_tokens=True)
print(output_seq)