---
license: apache-2.0
datasets:
  - Open-Orca/SlimOrca
---

## Prompt format

```
### System:\n{system}\n### Human:\n{user}\n### Assistant:\n
```
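As an illustration of how the template is filled in before tokenization, here is a minimal sketch; the `build_prompt` helper is hypothetical and not part of the repository:

```python
def build_prompt(user: str, system: str = "") -> str:
    """Fill the sparsetral prompt template with a system message and a user turn."""
    return f"### System:\n{system}\n### Human:\n{user}\n### Assistant:\n"

# Example: no system message, a single user question.
print(build_prompt("How are you?"))
```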

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code is required because sparsetral ships custom modeling code.
tokenizer = AutoTokenizer.from_pretrained("serpdotai/sparsetral-16x7B-v1", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "serpdotai/sparsetral-16x7B-v1", device_map="auto", trust_remote_code=True
).eval()

# Build a prompt in the expected format (empty system message here).
inputs = tokenizer("### System:\n\n### Human:\nHow are you?\n### Assistant:\n", return_tensors="pt")
inputs = inputs.to(model.device)

pred = model.generate(**inputs)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
# I am doing well, thank you.
```
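
The card does not prescribe decoding settings, so for longer or more varied outputs you can pass standard `transformers` sampling arguments to `generate()`; the values below are illustrative defaults, not recommendations from the model authors:

```python
# Reuses the tokenizer and model loaded above; sampling settings here are illustrative.
prompt = "### System:\n\n### Human:\nWrite a haiku about sparse mixtures of experts.\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

pred = model.generate(
    **inputs,
    max_new_tokens=256,   # cap the length of the completion
    do_sample=True,       # sample instead of greedy decoding
    temperature=0.7,
    top_p=0.9,
)

# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(pred[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```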