Getting Error while using RetrievalQA.from_chain_type(llm=llm, chain_type="stuff", retriever=docsearch.as_retriever())

#85
by Omkar-LLM - opened

Below is my code:

pc = Pinecone(api_key=PINECONE_API_KEY)
index = pc.Index("testing")
index_name = "testing"
llm = HuggingFaceEndpoint(repo_id="bigscience/bloom", max_new_tokens=250)
docsearch = PC.from_texts([t.page_content for t in text_chunks], embedding, index_name=index_name)
query = "YOLOv7 outperforms which models"
qa = RetrievalQA.from_chain_type(llm=llm, chain_type="stuff", retriever=docsearch.as_retriever())
qa.invoke(query)

I am getting the error below:

HTTPError: 400 Client Error: Bad Request for url: https://api-inference.huggingface.co/models/gpt2

The above exception was the direct cause of the following exception:
BadRequestError: (Request ID: e_CTTHeOIj4l95tPwHFS0)

Bad request:
The following model_kwargs are not used by the model: ['watermark', 'stop', 'stop_sequences'] (note: typos in the generate arguments will also show up in this list)
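The message suggests the endpoint is forwarding generate kwargs ('watermark', 'stop', 'stop_sequences') that the target model rejects. Purely as an illustration of what "stripping the unused kwargs" means (a plain-dict sketch, not LangChain's actual API; strip_unsupported is a hypothetical helper):

```python
# Parameter names copied from the error message above.
UNSUPPORTED = {"watermark", "stop", "stop_sequences"}

def strip_unsupported(kwargs: dict, unsupported=UNSUPPORTED) -> dict:
    """Return a copy of kwargs without the parameters the endpoint rejected."""
    return {k: v for k, v in kwargs.items() if k not in unsupported}

params = {"max_new_tokens": 250, "watermark": False, "stop_sequences": ["\n"]}
print(strip_unsupported(params))  # {'max_new_tokens': 250}
```

Whether these kwargs come from LangChain defaults or from a version mismatch between langchain and huggingface_hub, I haven't confirmed.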

Can anyone help me with this?

