<|im_end|> stop token

#2
by djuna - opened

I got something like this:
    .... better.<|im_end|>
    <|im_start|>user
    I'll ....
Does that config match up with the model? Is there a fix for this?

You're welcome to use it. This is a model that incorporates many elements, and I've also encountered issues while using it. Now that Llama 3.1 has been released, I'm focusing all my energy on fine-tuning Llama 3.1.

Stop Strings

    stop = [
      "## Instruction:",
      "### Instruction:",
      "<|end_of_text|>",
      "  //:",
      "</s>",
      "<3```",
      "### Note:",
      "### Input:",
      "### Response:",
      "### Emoticons:"
    ],
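
If it helps, here is a minimal sketch of how a stop-string list like this could be wired into generation with Hugging Face transformers via a custom StoppingCriteria. Only the stop strings and the repo id (linked below) come from this thread; the dtype, device map, prompt, and generation settings are assumptions, not the author's actual setup.

    # Minimal sketch (assumed setup, not the author's): apply the stop strings
    # above with a custom StoppingCriteria in Hugging Face transformers.
    import torch
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        StoppingCriteria,
        StoppingCriteriaList,
    )

    class StopOnStrings(StoppingCriteria):
        """Stop as soon as the generated text ends with any of the stop strings."""
        def __init__(self, stop_strings, tokenizer, prompt_len):
            self.stop_strings = stop_strings
            self.tokenizer = tokenizer
            self.prompt_len = prompt_len  # ignore the prompt itself when checking

        def __call__(self, input_ids, scores, **kwargs):
            new_text = self.tokenizer.decode(input_ids[0][self.prompt_len:])
            return any(new_text.endswith(s) for s in self.stop_strings)

    stop_strings = [
        "## Instruction:", "### Instruction:", "<|end_of_text|>", "  //:",
        "</s>", "<3```", "### Note:", "### Input:", "### Response:", "### Emoticons:",
    ]

    model_id = "aifeifei798/DarkIdol-Llama-3.1-8B-Instruct-1.2-Uncensored"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    prompt_ids = tokenizer.apply_chat_template(
        [{"role": "user", "content": "Hello!"}],  # placeholder prompt
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    output = model.generate(
        prompt_ids,
        max_new_tokens=256,
        stopping_criteria=StoppingCriteriaList(
            [StopOnStrings(stop_strings, tokenizer, prompt_ids.shape[-1])]
        ),
    )
    print(tokenizer.decode(output[0][prompt_ids.shape[-1]:], skip_special_tokens=True))

Recent transformers releases also accept stop_strings= (together with tokenizer=) directly in generate(), which would make the custom criteria unnecessary if your version supports it.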

The latest updated model is worth using; I currently use it for my daily tasks:
https://huggingface.co/aifeifei798/DarkIdol-Llama-3.1-8B-Instruct-1.2-Uncensored

I'll try it.

Thanks, that one doesn't have the issue.

djuna changed discussion status to closed
