
How does prompt formatting change on Kobold?

#1
by Nycoorias - opened

KoboldUI doesn’t have an instruction prompt, and before I make a larger post about how 0.5 compares to 0.2*, I would like to know whether I’m even using the right format.

*SPOILER: So far, I don’t feel it is worse, but neither do I feel it is better.

@Nycoorias Do you have errors in text generations with long context (4k+)?
I have an annoying issue on all backends where it breaks generation by starting a new hallucinated piece with the same characters -> drives me crazy xD

@MateoTeo
Character, like in chat mode?

If yes, then yes.
I have switched over to Bagel-dpo, and a good system prompt has fixed it.
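For reference, a minimal sketch of what "a good system prompt" might look like when assembled into an Alpaca-style template (one of the formats the Bagel cards list). The exact wording of the system prompt here is only an illustration, not the one actually used:

```python
def build_prompt(system: str, instruction: str) -> str:
    """Wrap a system prompt and instruction in an Alpaca-style template."""
    return (
        f"{system}\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Response:\n"
    )

# Hypothetical system prompt aimed at the scene-restart problem described above.
prompt = build_prompt(
    "You are a storyteller. Continue the current scene only; "
    "never start a new, unrelated scene.",
    "Continue the story where Billy is picking a gift for Janne.",
)
print(prompt)
```

Whether this helps depends on the model and backend; the point is just that the system prompt and instruction end up in the template the finetune was trained on.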

@Nycoorias
Nope, in story mode, not just chat with a character. But I guess it would be the same there.
Random example (just for demonstration): Billy called Janne this morning, she likes this summer and wantedThe New Year is closer and closer with each day of this winter. Billy walks into the mall to pick a suitable gift for Janne.

Something like that. Same characters, same 'world', different situations, and no periods.
And I think I tried a DPO version too. And I tested multiple versions on different backends, quants, and yeah, system prompts too.
Really weird, as it happens only with Bagel out of the 100+ LLM finetunes that I've tried. Even Airoboros, which includes part of the Bagel dataset, doesn't have it. Mystery nonetheless...
