Question: How to stop making it respond with questions?

#2
by natek000 - opened

I imported the model into Ollama with a model file based on the Llama3 template, and then tried giving it a system prompt to encourage it to answer questions. Instead, it just asks me questions about the system prompt or loosely related to the query.

E.g.
Prompt: "What is the capital of the United States?"
Answer: "Who makes the laws in the United States?"

(With system prompt)
Answer: "What is the penalty for not answering?" (my system prompt encouraged it to answer all questions)

I saw in another discussion someone was getting it to operate properly, so I'm wondering if there's a specific system prompt I should be using. Thanks!

They include one on the model page; it may be worth checking out.

Also, it's ideally meant to be used for RAG.

I tried the prompt template provided in Nvidia's model card, wrote the Modelfile for the Ollama installation, and ran the model, but hit the same confusing behavior.
Maybe I didn't write it correctly; I'll keep tuning it...
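For reference, here's a rough sketch of what such a Modelfile might look like (the GGUF filename, the exact template text, and the stop token are all assumptions on my part; check Nvidia's ChatQA model card for the canonical format). The key point is that ChatQA expects a plain-text `System:`/`User:`/`Assistant:` layout rather than Llama3's special-token template:

```
# Hypothetical Modelfile sketch -- filename and template are assumptions
FROM ./Llama3-ChatQA-1.5-8B.Q4_K_M.gguf

# ChatQA uses a plain "System: ... User: ... Assistant:" layout,
# not the Llama3 special-token template.
TEMPLATE """System: {{ .System }}

User: {{ .Prompt }}

Assistant:"""

SYSTEM """This is a chat between a user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions based on the context."""

# Stop before the model starts inventing the next user turn.
PARAMETER stop "User:"
```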

But on LM Studio it runs great, following the guidelines here: lmstudio-community/Llama3-ChatQA-1.5-8B-GGUF · Hugging Face

Using it for RAG worked beautifully, thanks!
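For anyone else trying the RAG route, here's a minimal sketch of RAG-style prompting against Ollama's REST API. It assumes the model was imported under the name `llama3-chatqa` and that Ollama is listening on its default port (11434); the prompt layout (retrieved context first, then the question) follows the ChatQA pattern, but the exact wording is my assumption:

```python
import json
import urllib.request

def build_chatqa_prompt(context: str, question: str) -> str:
    """Assemble a ChatQA-style prompt: retrieved context first, then the question."""
    return f"{context}\n\nUser: {question}\n\nAssistant:"

def ask(context: str, question: str, model: str = "llama3-chatqa") -> str:
    """Send a non-streaming generate request to a local Ollama server."""
    payload = json.dumps({
        "model": model,  # assumed import name for the ChatQA GGUF
        "prompt": build_chatqa_prompt(context, question),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would be something like `ask("Washington, D.C. is the capital of the United States.", "What is the capital of the United States?")` with your retriever supplying the context string.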
