---
language:
- en
datasets:
- kyujinpy/Open-platypus-Commercial
library_name: transformers
pipeline_tag: text-generation
license: mit
---
# **phi-2-platypus-Commercial-lora**
## Model Details
**Model Developers**
- field2437
**Base Model**
- [microsoft/phi-2](https://huggingface.co/microsoft/phi-2)
**Training Dataset**
- [kyujinpy/Open-platypus-Commercial](https://huggingface.co/datasets/kyujinpy/Open-platypus-Commercial)
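
The training data can be inspected directly with the `datasets` library. A minimal sketch is shown below; the `train` split name and the record layout are assumptions about the dataset repository, not something documented in this card.

```python
from datasets import load_dataset

# Load the Open-Platypus-Commercial dataset used for fine-tuning
# (assumes the default "train" split; adjust if the repository differs)
dataset = load_dataset("kyujinpy/Open-platypus-Commercial", split="train")

# Inspect one record; the exact column names (e.g. instruction/output fields)
# are an assumption based on typical Platypus-style datasets
print(dataset[0])
```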
---
# Model Comparison
> Evaluated with the EleutherAI LM Evaluation Harness; [link](https://github.com/EleutherAI/lm-evaluation-harness)
| Model | Copa (0-shot) | HellaSwag (0-shot) | BoolQ (0-shot) | MMLU (0-shot) |
| --- | --- | --- | --- | --- |
| **phi-2-platypus-Commercial-lora** | 0.8900 | 0.5573 | 0.8260 | 0.5513 |
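
A rough sketch of how such a 0-shot evaluation could be reproduced with a recent `lm-eval` release is shown below. The `simple_evaluate` API and the task names follow lm-eval >= 0.4 and are assumptions; the exact harness version and settings behind the reported scores are not documented here.

```python
import lm_eval

# Run 0-shot evaluation on the same tasks as the table above
# (task names and API assume a recent lm-eval release)
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=field2437/phi-2-platypus-Commercial-lora,trust_remote_code=True",
    tasks=["copa", "hellaswag", "boolq", "mmlu"],
    num_fewshot=0,
)

print(results["results"])
```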
---
# Sample Code
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.set_default_device("cuda")

# Load the fine-tuned model and its tokenizer
model = AutoModelForCausalLM.from_pretrained("field2437/phi-2-platypus-Commercial-lora", torch_dtype="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("field2437/phi-2-platypus-Commercial-lora", trust_remote_code=True)

# Build an Alpaca-style instruction prompt
inputs = tokenizer('''Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
Let $f(x)$ be the polynomial \\[f(x)=3x^4+5x^2-9x-2.\\] If $g(x)$ is equal to the polynomial $f(x-1)$, what is the sum of the coefficients of $g$?
### Response:
''', return_tensors="pt", return_attention_mask=False)

# Generate up to 200 tokens and decode the response
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
```
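
The sample above loads the repository directly, which assumes merged weights are published there. If the repository instead ships only LoRA adapter weights (a possibility suggested by the model name, not confirmed by this card), the adapter could be attached to the base model with `peft` as in the sketch below.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base phi-2 model, then attach the LoRA adapter from this repository.
# Only needed if the repo contains adapter weights rather than merged weights.
base = AutoModelForCausalLM.from_pretrained("microsoft/phi-2", torch_dtype="auto", trust_remote_code=True)
model = PeftModel.from_pretrained(base, "field2437/phi-2-platypus-Commercial-lora")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2", trust_remote_code=True)
```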
---