Prompts for Question Answering Assistant

#21
by pratikhublikar - opened

I am building a question-answering assistant using the model. What prompts can I use so that the responses generated by the model are brief, to the point, and coherent?

An example using LangChain's prompts:

from langchain.llms import CTransformers
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Path to a local GGML quantization of Llama-2-7B-Chat
model_path: str = "models/Llama-2-7B-Chat-GGML/llama-2-7b-chat.ggmlv3.q4_0.bin"

# Load the model through the ctransformers backend
llm = CTransformers(
    model=model_path,
    model_type="llama",
)

# The instruction to keep the answer short lives directly in the prompt text
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}? Answer with a simple list only.",
)

llmchain = LLMChain(llm=llm, prompt=prompt)
print(llmchain.run("podcast player"))
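
For a question-answering assistant specifically, the same idea applies: state the length and style constraints directly in the template text. A minimal sketch continuing from the snippet above (the template wording is only an illustration, tune it to your use case):

# Reuses the llm object defined above; the brevity constraint is part of the prompt itself
qa_prompt = PromptTemplate(
    input_variables=["question"],
    template=(
        "Answer the following question in at most two sentences. "
        "Be direct and do not add extra commentary.\n\n"
        "Question: {question}\nAnswer:"
    ),
)

qa_chain = LLMChain(llm=llm, prompt=qa_prompt)
print(qa_chain.run("What is quantization in the context of LLMs?"))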

The prompt template of Llama 2 is <s>[INST]\n<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_prompt}[/INST] (ref. https://huggingface.co/spaces/huggingface-projects/llama-2-7b-chat/blob/main/model.py). Does the CTransformers library handle this automatically? Otherwise, how can your example work? Thanks, Vincenzo
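
One workaround is to build the format manually inside the PromptTemplate, for example (not sure this is the intended way, and the system prompt wording is just an example):

from langchain.llms import CTransformers
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Llama-2 chat format with the system prompt filled in;
# {user_prompt} stays as the template variable
system_prompt = "You are a helpful assistant. Answer briefly and to the point."
template = (
    "<s>[INST]\n<<SYS>>\n" + system_prompt + "\n<</SYS>>\n\n{user_prompt}[/INST]"
)

llm = CTransformers(
    model="models/Llama-2-7B-Chat-GGML/llama-2-7b-chat.ggmlv3.q4_0.bin",
    model_type="llama",
)

prompt = PromptTemplate(input_variables=["user_prompt"], template=template)
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("What is a good name for a company that makes podcast players?"))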

The PR dealing with this issue just got merged: https://github.com/huggingface/transformers/pull/25323
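
If I understand correctly, that PR adds chat templates to the tokenizers in transformers, so the Llama-2 format can be produced like this (untested sketch, assuming access to the meta-llama/Llama-2-7b-chat-hf tokenizer):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")

messages = [
    {"role": "system", "content": "You are a helpful assistant. Answer briefly."},
    {"role": "user", "content": "What is a good name for a company that makes podcast players?"},
]

# Renders the conversation into the model's expected [INST]/<<SYS>> format
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)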
