Prompt format support

#2
by limcheekin - opened

Hi there,

It looks like the converted model doesn't support the prompt format the model was trained with.

Please see the following code for the prompt format:
https://huggingface.co/spaces/huggingface-projects/llama-2-13b-chat/blob/main/model.py#L24
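For reference, the linked function renders the system prompt, the chat history, and the new message into a single Llama-2 chat prompt string. A simplified sketch of that logic (not a copy of the linked code):

```python
def get_prompt(message: str, chat_history: list[tuple[str, str]], system_prompt: str) -> str:
    # Llama-2 chat wraps the system prompt in <<SYS>> tags inside the first [INST] block.
    texts = [f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"]
    for user_input, response in chat_history:
        # Each past turn is closed with </s>, then a new [INST] block is opened.
        texts.append(f"{user_input.strip()} [/INST] {response.strip()} </s><s>[INST] ")
    texts.append(f"{message.strip()} [/INST]")
    return "".join(texts)
```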

Any idea how to make the converted model work with the specific prompt format above?

Thanks.

Best regards.

I think you got some key concepts mixed up here. High level: the highlighted function formats a chat history into a string, and the model's tokenizer supports tokenisation of strings.
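In other words, the prompt format lives entirely on the string side; once the chat is rendered to a string, the tokenizer handles it like any other text. A minimal sketch, assuming a Hugging Face transformers tokenizer and the get_prompt helper above (the model id is illustrative, not this repo's):

```python
from transformers import AutoTokenizer

# Any Llama-2 checkpoint's tokenizer behaves the same way here: string in, token ids out.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-hf")

prompt = get_prompt(
    message="What is the capital of France?",
    chat_history=[],
    system_prompt="You are a helpful assistant.",
)
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
```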

Detailed: in case I got you wrong here, what exactly is not supported? Any tokenisation issues?

Note that this model is the base model, not the one fine-tuned for chat.

Sorry for the silly mistake. I used the prompt with the wrong model.

Thanks for the heads up.

limcheekin changed discussion status to closed
