thx

#1
by Danioken - opened

"I found success with:
temperature 0.9-1.2
min_p 0.08
tfs 0.97
smoothing_factor 0.3
smoothing_curve 1.1"

These settings really do work great. Until now I've relied mostly on top-k for "diversity", but your suggestion made me curious, so I tried it and was surprised: I had never used "tfs" before, and it turns out to be a great alternative to high top-k and high temperature. It works well not only in your model but also in all the other models I tried today.
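For anyone curious what min_p actually does, here is a rough NumPy sketch of the cutoff it applies (my own illustration, not code from any particular backend; real samplers operate on logits and differ in details):

```python
import numpy as np

def min_p_filter(probs, min_p=0.08):
    """Zero out tokens below min_p times the top token's probability.

    Minimal sketch for illustration only. The cutoff scales with how
    confident the model is: a sharp distribution keeps few tokens, a
    flat one keeps many.
    """
    threshold = min_p * probs.max()                  # confidence-relative cutoff
    kept = np.where(probs >= threshold, probs, 0.0)  # drop everything below it
    return kept / kept.sum()                         # renormalize survivors
```

So with min_p 0.08 and a top token at probability 0.5, anything under 0.04 is dropped before sampling.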

Thank you, thanks to you I learned something new today that will make it easier for me to test models.

As for the model itself, I can't get over the fact that it's only 15B! With my previous settings (top-k: 80, temp: 1.1, min_p: 0.05) the model sometimes went off the rails - not often, but enough that I had to regenerate many times. With your suggested settings the results are truly impressive: the model keeps its creativity but follows the plot threads and instructions better.

Thank you for the model you created.

I have a set of presets I keep coming back to, and even with this model one of them makes it more attentive.
With the recommended presets, my character cards that mention other people in the backstory make the model confuse the user with those other characters.
With the Nymeria preset it seems much better at distinguishing the user from the people mentioned.
This model makes me disappointed there wasn't a new Llama in the 11-15B range; even at IQ3_M this model is impressive.
Nymeria:

temp 0.9
top_k 30
top_p 0.75
min_p 0.2
rep_pen 1.1
smooth_factor 0.25
smooth_curve 1

https://huggingface.co/tannedbum/L3-Nymeria-8B
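As a reference for how the two caps in this preset interact, here is a rough NumPy sketch of top_k followed by top_p (my own illustration; real backends run on logits, and the order and renormalization of samplers varies between them):

```python
import numpy as np

def top_k_top_p_filter(probs, top_k=30, top_p=0.75):
    """Apply a top-k cut, then a nucleus (top-p) cut, and renormalize.

    Sketch only: top_k keeps the k most likely tokens, then top_p trims
    that set further to the smallest prefix whose mass reaches top_p.
    """
    order = np.argsort(probs)[::-1]              # tokens sorted high to low
    keep = order[:top_k]                         # top-k cut
    trimmed = probs[keep]
    cum = np.cumsum(trimmed / trimmed.sum())     # renormalize, then nucleus cut
    n_keep = int(np.searchsorted(cum, top_p)) + 1
    mask = np.zeros_like(probs)
    mask[keep[:n_keep]] = probs[keep[:n_keep]]
    return mask / mask.sum()                     # renormalize the final set
```

With a low top_p like 0.75, the nucleus cut usually does most of the work and top_k acts as a hard ceiling.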

That's true. The 70B has a lot of potential compared to the small Llama... a 15-20B model would be perfect for a huge number of people.

I will test these settings, thank you.

What settings would you recommend for writing stories?

No problem, thank you so much for the feedback!
TFS is pretty cool since it cuts off the tail end of the distribution, which is mostly "garbage" tokens, and that means you can use a lower min_p and get more diversity.
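Roughly, tail-free sampling sorts the token probabilities, looks at where the curvature of that sorted curve flattens out, and cuts the tail there. A NumPy sketch of the idea (my own illustration; the tfs setting plays the role of z here, and real backends work on logits):

```python
import numpy as np

def tail_free_filter(probs, z=0.97):
    """Trim the low-probability tail using the TFS curvature heuristic.

    Sketch only: the second derivative of the sorted distribution is
    normalized to sum to 1, and tokens past the point where its
    cumulative sum exceeds z are dropped.
    """
    order = np.argsort(probs)[::-1]            # tokens sorted high to low
    sorted_p = probs[order]
    d2 = np.abs(np.diff(sorted_p, n=2))        # curvature of the sorted curve
    if d2.sum() == 0:                          # flat distribution: trim nothing
        return probs / probs.sum()
    cum = np.cumsum(d2 / d2.sum())             # normalized cumulative curvature
    n_keep = int(np.searchsorted(cum, z)) + 1  # cut where it passes z
    mask = np.zeros_like(probs)
    mask[order[:n_keep]] = probs[order[:n_keep]]
    return mask / mask.sum()                   # renormalize survivors
```

Unlike a fixed top-k, the cut point moves with the shape of the distribution, which is why it pairs well with a lower min_p.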

I was also really disappointed by the lack of a Llama 13B this time, because it's literally the best medium size; I imagine a native L3 13B with 16k context would be godly. I tried the Nymeria preset that SaisExperiments suggested, and it gained a lot of coherence and stopped confusing my characters as well, though I feel like it lost some creativity. It's a really good preset, probably because Nymeria and this model are similar, so I will add it to the recommended samplers.
