---
license: mit
license_link: https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE

language:
- en

pipeline_tag: text-generation

tags:
- nlp
- code
---

## Model Summary

Phi-mmlu-lora is a LoRA adapter fine-tuned on the GSM8K dataset. The base model is [microsoft/phi-2](https://huggingface.co/microsoft/phi-2).
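
LoRA adapters like this one are built from a `peft` `LoraConfig`. The actual training hyperparameters for this adapter are not published, but a typical configuration for phi-2 might look like the following sketch (all values are illustrative assumptions, not the settings actually used):

```python
from peft import LoraConfig

# Hypothetical adapter configuration -- illustrative values only;
# the real hyperparameters for phi-mmlu-lora are not published.
lora_config = LoraConfig(
    r=16,                  # rank of the low-rank update matrices
    lora_alpha=32,         # scaling factor applied to the update
    target_modules=["q_proj", "k_proj", "v_proj", "dense"],  # phi-2 attention projections
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
```

Such a config would be passed to `peft.get_peft_model` together with the base phi-2 model before fine-tuning.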


## How to Use


```python
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

# Run everything on the GPU by default (requires torch >= 2.0).
torch.set_default_device("cuda")

# Load the LoRA adapter together with its base model (microsoft/phi-2).
model = AutoPeftModelForCausalLM.from_pretrained("liuchanghf/phi2-mmlu-lora")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2", trust_remote_code=True)

inputs = tokenizer('''def print_prime(n):
   """
   Print all primes between 1 and n
   """''', return_tensors="pt", return_attention_mask=False)

outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```