# aanaphi2-v0.1
Metadata:

```yaml
license: mit
train: false
inference: false
pipeline_tag: text-generation
```

aanaphi2-v0.1 is a finetuned (SFT + DPO) chat model based on Microsoft's Phi-2 base model (2.8B parameters).


## Performance

| Benchmark | phi-2 | aanaphi2-v0.1 |
|---|---|---|
| ARC (25-shot) | 61.09 | 63.73 |
| HellaSwag (10-shot) | 75.11 | 78.30 |
| MMLU (5-shot) | 58.11 | 57.70 |
| TruthfulQA-MC2 | 44.47 | 51.55 |
| Winogrande (5-shot) | 74.35 | 73.40 |
| GSM8K (5-shot) | 54.81 | 58.60 |
| **Average** | 61.33 | 63.88 |
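The reported averages are the unweighted mean of the six benchmark scores. A quick sanity check (small differences from the table can come from the per-task scores themselves being rounded to two decimals):

```python
# Per-task scores copied from the table above
phi2     = [61.09, 75.11, 58.11, 44.47, 74.35, 54.81]
aanaphi2 = [63.73, 78.30, 57.70, 51.55, 73.40, 58.60]

def avg(scores):
    """Unweighted mean, rounded to two decimals like the table."""
    return round(sum(scores) / len(scores), 2)

print(avg(phi2), avg(aanaphi2))
```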

## Basic Usage

```python
# Load model
import transformers, torch

compute_dtype = torch.float16
cache_path    = ''
device        = 'cuda'
model_id      = "mobiuslabsgmbh/aanaphi2-v0.1"
model         = transformers.AutoModelForCausalLM.from_pretrained(model_id,
                                                                  torch_dtype=compute_dtype,
                                                                  cache_dir=cache_path,
                                                                  device_map=device)
tokenizer     = transformers.AutoTokenizer.from_pretrained(model_id, cache_dir=cache_path)
model.eval()

# Set prompt format
instruction_template = "### Human: "
response_template    = "### Assistant: "

def prompt_format(prompt):
    return instruction_template + prompt + '\n' + response_template

# Generation helper
@torch.no_grad()
def generate(prompt, max_length=1024):
    prompt_chat = prompt_format(prompt)
    inputs      = tokenizer(prompt_chat, return_tensors="pt", return_attention_mask=True).to(device)
    outputs     = model.generate(**inputs, max_length=max_length, eos_token_id=tokenizer.eos_token_id)
    text        = tokenizer.batch_decode(outputs[:, :-1])[0]  # drop the trailing EOS token
    return text

# Generate
print(generate('If A+B=C and B=C, what would be the value of A?'))
```
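Under this template, a user message is wrapped between the `### Human:` and `### Assistant:` markers before being tokenized. The formatting step can be checked in isolation, without loading the model (the templates below are copied from the usage snippet):

```python
instruction_template = "### Human: "
response_template    = "### Assistant: "

def prompt_format(prompt):
    return instruction_template + prompt + '\n' + response_template

print(repr(prompt_format("What is 2+2?")))
# '### Human: What is 2+2?\n### Assistant: '
```

The model then continues the text after `### Assistant: `, which is why the helper appends that marker last.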

## Installation

At the time of writing, Phi-2 support has only been integrated into the development version (4.37.0.dev) of transformers. Until that version is released on pip, update your local transformers install to the development version:

```shell
pip uninstall -y transformers
pip install git+https://github.com/huggingface/transformers
```
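To verify the requirement after installing, you can compare `transformers.__version__` against `4.37.0`. Below is a minimal sketch of such a check; `at_least` is a hypothetical helper written here for illustration (in practice `packaging.version.parse` is the standard tool), which compares dotted version strings numerically and stops at non-numeric parts such as `dev0`:

```python
def at_least(installed, required):
    """Hypothetical helper: numeric comparison of dotted version strings.

    Stops parsing at the first non-numeric component (e.g. 'dev0'),
    so '4.37.0.dev0' compares as [4, 37, 0].
    """
    def key(v):
        nums = []
        for part in v.split("."):
            if not part or not part[0].isdigit():
                break
            nums.append(int("".join(ch for ch in part if ch.isdigit())))
        return nums
    return key(installed) >= key(required)

print(at_least("4.37.0.dev0", "4.37.0"))  # True
print(at_least("4.36.2", "4.37.0"))       # False
```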