Defaults to short responses.

Opened by Ingenitus

I can't figure out how to make the model be descriptive like Emerthyst-20B (that model instead has problems with acting as the MC, which this model does not have). It defaults to the short, chat-like responses you would expect from smaller models, even when I tell it to use long-form paragraphs. For a 70B model that's not great, and it also sometimes has trouble with logical coherence on more advanced RP prompts. I found Euryale-70B works better for my use case.

What system prompt are you using? Some post-release testing has shown that the model might actually perform a bit better if you use Alpaca-style prompting like:

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:

[prompt here]

### Response:

I use the "### Instruction" header at the beginning of context.
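If you are building the prompt in code rather than in a frontend, here is a minimal sketch of how that Alpaca-style template could be assembled as a string. The `build_alpaca_prompt` helper and the example instruction are hypothetical illustrations, not something from the model card.

```python
# Sketch of assembling the Alpaca-style prompt described above.
# The template text follows the model card; the helper name and
# example instruction are assumptions for illustration.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_alpaca_prompt(instruction: str) -> str:
    """Return the full prompt string to send to the model."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

if __name__ == "__main__":
    # Hypothetical instruction asking for long-form roleplay output.
    prompt = build_alpaca_prompt(
        "Continue the scene in descriptive, long-form paragraphs."
    )
    print(prompt)
```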
