---
license: mit
---

# Introduction

This is ItbearZhang/facebook-opt-125m-with-alpacadataset, a model fine-tuned from the facebook/opt-125m pretrained language model on the Stanford Alpaca dataset from tatsu-lab/stanford_alpaca (code and documentation to train Stanford's Alpaca models and generate the data, on github.com).

# How to use it for inference?

```python
from transformers import pipeline

# Load the fine-tuned model as a text-generation pipeline
generator = pipeline("text-generation", model="ItbearZhang/facebook-opt-125m-with-alpacadataset")

def generate_by_pipeline(instruction, inputs=""):
    # Build an Alpaca-style prompt, with or without the optional input field
    if inputs == "":
        prompt = f"### Instruction:\n{instruction}\n\n### Response:"
    else:
        prompt = f"### Instruction:\n{instruction}\n\n### Input:\n{inputs}\n\n### Response:"
    return generator(prompt)[0]['generated_text']

print(generate_by_pipeline("What is the capital of China?"))
```
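For instructions that also take an input field, pass it as the second argument. A minimal sketch (the instruction and input text below are illustrative examples, not from the original card):

```python
# Example call using the optional inputs field of the Alpaca prompt format
print(generate_by_pipeline(
    "Translate the following sentence to French.",
    inputs="Hello, how are you?"
))
```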