Prompt template

#1 · opened by tomasmcm

Hi there, what's the best prompt template to use with this model? Based on the tokenizer_config.json it should be:

<|system|>
{system}<eos>
<|user|>
{prompt}<eos>
<|assistant|>

But it seems to hallucinate a lot.
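For reference, here's a minimal sketch of how I'm checking the rendered prompt, assuming the base repo's tokenizer_config.json ships this chat template (the example messages are just placeholders):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tensoropera/Fox-1-1.6B")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# tokenize=False returns the rendered prompt string so the exact format
# can be inspected; it should match the <|system|>/<|user|>/<|assistant|>
# layout quoted above if the repo ships that chat template.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)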

With ChatML it appears to work better, although it includes "<|im_end|>" in the response:

<|im_start|>system
{system}<|im_end|>
<|im_start|>user
{user}<|im_end|>
<|im_start|>assistant
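And this is roughly the workaround I'm using for the trailing marker: build the ChatML prompt by hand and cut the decoded output at the first <|im_end|>. Just a sketch; the messages and generation settings are placeholders:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tensoropera/Fox-1-1.6B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# ChatML prompt built by hand, since this is not the tokenizer's own template.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat is the capital of France?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

# The ChatML markers are presumably not special tokens for this tokenizer,
# so skip_special_tokens won't remove them; cut at the first <|im_end|>.
text = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
response = text.split("<|im_end|>", 1)[0].strip()
print(response)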

This is a base pre-trained model and does not support chat mode. We will release the instruction-tuned version soon.

zijianhu changed discussion status to closed
TensorOpera org:

@tomasmcm We just released the instruct model. Please check it out at https://huggingface.co/tensoropera/Fox-1-1.6B-Instruct-v0.1.
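A minimal usage sketch for the instruct checkpoint, assuming it ships a chat template in its tokenizer_config.json (the example message and generation settings are placeholders):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tensoropera/Fox-1-1.6B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "What is the capital of France?"}]

# With a chat template shipped in tokenizer_config.json, apply_chat_template
# handles the prompt format so nothing needs to be hard-coded.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))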
