---
license: mit
---
# Introduction
This is ItbearZhang/facebook-opt-125m-with-alpacadataset, a model fine-tuned from the facebook/opt-125m pretrained language model on the Stanford Alpaca dataset ([tatsu-lab/stanford_alpaca](https://github.com/tatsu-lab/stanford_alpaca)).

# How to use it for inference
```python
from transformers import pipeline

# Build a text-generation pipeline from the fine-tuned checkpoint
generator = pipeline(
    "text-generation",
    model="ItbearZhang/facebook-opt-125m-with-alpacadataset",
)

def generate_by_pipeline(instruction, inputs=""):
    # Format the prompt in the Alpaca style used during fine-tuning
    if inputs == "":
        prompt = f"### Instruction:\n{instruction}\n\n### Response:"
    else:
        prompt = f"### Instruction:\n{instruction}\n\n### Input:\n{inputs}\n\n### Response:"
    return generator(prompt)[0]["generated_text"]

print(generate_by_pipeline("What is the capital of China?"))
```
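
The prompt template can be inspected without downloading the model. A minimal standalone sketch of just the prompt construction, showing both the instruction-only form and the form with an additional `### Input` section (the helper name `build_alpaca_prompt` is illustrative, not part of the library):

```python
def build_alpaca_prompt(instruction, inputs=""):
    # Instruction-only prompts omit the "### Input" section entirely;
    # prompts with extra context insert it between Instruction and Response.
    if inputs == "":
        return f"### Instruction:\n{instruction}\n\n### Response:"
    return f"### Instruction:\n{instruction}\n\n### Input:\n{inputs}\n\n### Response:"

print(build_alpaca_prompt("Translate to French.", "Hello, world."))
```

The model then completes the text after `### Response:`, so the generated answer can be recovered by stripping the prompt prefix from `generated_text`.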