When I try to load the GGUF model, it throws:
OSError: Deci/DeciLM-7B-instruct-GGUF does not appear to have a file named config.json. Checkout 'https://huggingface.co/Deci/DeciLM-7B-instruct-GGUF/main' for available files.
How do I load this GGUF model? Please see the code I used below:
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Deci/DeciLM-7B-instruct-GGUF"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    model_file="decilm-7b-uniform-gqa-q8_0.gguf",
)
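GGUF repositories do not ship a `config.json`, which is why the standard `transformers` loader raises the `OSError` above. A minimal sketch of an alternative, assuming the `ctransformers` package is installed (its own `AutoModelForCausalLM.from_pretrained` reads GGUF files directly via a `model_file` argument); the import is deferred inside the function so the sketch stays self-contained:

```python
def load_gguf_model(repo_id: str, gguf_filename: str):
    """Load a GGUF checkpoint with ctransformers instead of transformers.

    Sketch under the assumption that ctransformers is installed; the import
    is deferred so the function can be defined without it.
    """
    from ctransformers import AutoModelForCausalLM

    # ctransformers reads the GGUF file directly, so no config.json is needed
    return AutoModelForCausalLM.from_pretrained(repo_id, model_file=gguf_filename)


# Usage (downloads the model weights, so it is left commented out here):
# model = load_gguf_model("Deci/DeciLM-7B-instruct-GGUF",
#                         "decilm-7b-uniform-gqa-q8_0.gguf")
```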
Hey @narensymb - Take a look at the following notebook, I think it's exactly what you're looking for: https://colab.research.google.com/drive/1y4RCTIfTTb53b_S95xl4IZaW8am6sBxz
Thank you @harpreetsahota, I have loaded the model, but now when I try to use it in RetrievalQA as shown below, I get an error:
qa_chain_with_memory = RetrievalQA.from_chain_type(
    llm=model,
    chain_type="stuff",
    retriever=db.as_retriever(search_type="mmr", search_kwargs={"k": 8}),
    return_source_documents=True,
    chain_type_kwargs={
        "verbose": True,
        "prompt": prompt,
        "memory": ConversationBufferMemory(
            input_key="question",
            return_messages=True,
        ),
    },
)
ValidationError: 2 validation errors for LLMChain
llm
instance of Runnable expected (type=type_error.arbitrary_type; expected_arbitrary_type=Runnable)
llm
instance of Runnable expected (type=type_error.arbitrary_type; expected_arbitrary_type=Runnable)
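The ValidationError above indicates that `RetrievalQA.from_chain_type` expects a LangChain `Runnable` LLM, not a raw model object. One possible fix, sketched here under the assumption that `langchain`, `langchain-community`, and `ctransformers` are installed, is to wrap the GGUF file in LangChain's `CTransformers` LLM class and pass that as `llm`; imports are deferred so the sketch stands on its own:

```python
def make_retrieval_qa(db, prompt):
    """Build a RetrievalQA chain around a GGUF model wrapped as a LangChain LLM.

    Sketch assuming langchain, langchain-community, and ctransformers are
    installed; db and prompt are the vector store and prompt from the
    original post.
    """
    from langchain.chains import RetrievalQA
    from langchain.memory import ConversationBufferMemory
    from langchain_community.llms import CTransformers

    # CTransformers is a LangChain LLM (hence a Runnable), so the
    # "instance of Runnable expected" validation error no longer applies.
    llm = CTransformers(
        model="Deci/DeciLM-7B-instruct-GGUF",
        model_file="decilm-7b-uniform-gqa-q8_0.gguf",
    )
    return RetrievalQA.from_chain_type(
        llm=llm,
        chain_type="stuff",
        retriever=db.as_retriever(search_type="mmr", search_kwargs={"k": 8}),
        return_source_documents=True,
        chain_type_kwargs={
            "verbose": True,
            "prompt": prompt,
            "memory": ConversationBufferMemory(
                input_key="question", return_messages=True
            ),
        },
    )
```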
Can you help me with this? Thanks in advance.
Hey @narensymb - I am so sorry, I didn't see this comment until now. Can you provide more context, perhaps a reproducible notebook? This seems to be an error originating from LangChain, which has recently undergone a lot of changes.