It just didn't want to stop talking...

#1
by Boffy - opened

It was a simple question, ooba latest, the only setting changed was max_new_tokens (4096)...

Had to kill the process for it to stop; the stop button didn't work...

https://pastebin.com/c3UNqShv

No issues with the model stopping for me here. Try using exui as an alternative UI and loader, or redownload the model.

@boffy Try changing the prompt template in ooba to chatml. I see "You" and "AI" in your text, which suggests you are using the default chat template, which doesn't have the ChatML prompt structure with <|im_start|> and <|im_end|> tokens. <|im_end|> is the EOS token that should stop generation here, but since you are using a different prompt format, it's expected that it won't be emitted. Also, make sure you didn't disable adding the BOS token in ooba settings - according to the json files it should be enabled for this model.
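For reference, here is a minimal sketch of the ChatML structure described above. The helper function and the example messages are illustrative, not taken from the thread; in practice the tokenizer's own chat template should produce the same shape.

```python
# Minimal sketch of the ChatML prompt structure this model expects.
# The function name and example messages are illustrative.

def format_chatml(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|>
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Leave an open assistant turn; generation should stop when the
        # model emits <|im_end|>, which serves as the EOS token here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

With the default "You"/"AI" template instead, the model never sees this structure, so it has no reason to emit <|im_end|> and generation runs until max_new_tokens.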
