What is the template for prompting here?

#6 opened by sauravm8

I am trying to load and run this model with AutoGPTQ, but I am not getting appropriate responses.

Code
# model and tokenizer are assumed to be loaded already (see below)
final_message = input()
inputs = tokenizer(final_message, return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.8)
print(tokenizer.decode(tokens[0]))
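The model and tokenizer are loaded with AutoGPTQ roughly as follows (a sketch only; the repo name and loading options here are placeholders, not the exact ones used):

from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

# placeholder repo name, not the actual model repo
model_name_or_path = "TheBloke/some-model-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_name_or_path,
    device="cuda:0",
    use_safetensors=True,
    use_triton=False,
)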

Input and Output

Hi

< s > Hi there!
I'm Haisam. I am a software engineer in the San Francisco Bay Area and I have a passion for technology and programming. I enjoy building scalable and efficient applications and have experience in various programming languages such as Java, Python, and JavaScript. I also have experience working with databases and various web frameworks such as Spring and Django.
I hold a Bachelor's degree in Computer Science and have worked in the industry for several years. I enjoy keeping up with the

Note: I added spaces inside < s > so it doesn't render as strikethrough.
Thanks in advance, TheBloke.

This is the correct format:

USER: What is 4x3?
ASSISTANT:
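
Applied to the snippet above, that means wrapping the raw input in the template before tokenizing. A minimal sketch (the newline placement and skip_special_tokens are assumptions, not taken from the model card):

user_message = input()
# wrap the raw input in the USER:/ASSISTANT: template before tokenizing
prompt = f"USER: {user_message}\nASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.8)
# skip_special_tokens=True drops the leading <s> token from the decoded text
print(tokenizer.decode(tokens[0], skip_special_tokens=True))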
