Model Summary
bloomz-3b-mmlu-lora is a LoRA adapter fine-tuned on the MMLU dataset. The base model is bigscience/bloomz-3b.
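
Concretely, this repository holds only the LoRA adapter weights, which are applied on top of bigscience/bloomz-3b at load time. A minimal sketch of the explicit two-step load using peft's PeftModel (the shorter AutoPeft path is shown under How to Use below):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the frozen base model, then attach the LoRA adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("bigscience/bloomz-3b")
model = PeftModel.from_pretrained(base_model, "liuchanghf/bloomz-3b-mmlu-lora")
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloomz-3b")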
How to Use
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

# Place newly created tensors and modules on the GPU by default.
torch.set_default_device("cuda")

# Load the LoRA adapter together with its base model (bigscience/bloomz-3b).
model = AutoPeftModelForCausalLM.from_pretrained("liuchanghf/bloomz-3b-mmlu-lora")
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloomz-3b")

# Tokenize a prompt and generate a completion.
inputs = tokenizer('''def print_prime(n):
    """
    Print all primes between 1 and n
    """''', return_tensors="pt", return_attention_mask=False)
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
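
If you prefer to serve the model without keeping the adapter separate, the LoRA weights can be folded into the base weights with peft's merge_and_unload. A brief sketch, assuming the model object loaded above; the output directory name is hypothetical:

# Merge the LoRA weights into the base model for plain transformers inference.
merged_model = model.merge_and_unload()
merged_model.save_pretrained("bloomz-3b-mmlu-merged")  # hypothetical output path
tokenizer.save_pretrained("bloomz-3b-mmlu-merged")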