Which model_type should I set, "None" or "llama"? And which prompt style should I use? There are plenty of the latter in Ooba's TextGenWebUI as of now and I'm pretty lost (see pics for clarification).

#10
by sneedingface - opened

Also, if this info is generally available somewhere in the config files of the repo and I can find it on my own, please let me know so I won't have to bother you all with these noob-ish questions again. Thank you!
[Attached screenshots: ModelType.png, PromptStyle.png]

Use llama for model type and alpaca for prompt style.
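
For context, the "alpaca" prompt style corresponds to the standard Alpaca instruction template. Here is a minimal sketch in Python of what that single-turn format looks like; the exact wording of the template bundled with TextGenWebUI may differ slightly:

```python
# Sketch of the standard Alpaca single-turn prompt format.
# The exact template shipped with TextGenWebUI may differ slightly in wording.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Fill the Alpaca template with a user instruction."""
    return ALPACA_TEMPLATE.format(instruction=instruction)

if __name__ == "__main__":
    print(build_prompt("Explain what the model_type setting does."))
```

If you want to check this kind of thing yourself, the model's config.json usually lists the underlying architecture under "model_type" (e.g. "llama"), which is what the loader setting refers to.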

@MetaIX I swear I don't want to bother you, but I would really appreciate it if you could reply to this; I'm trying to learn what I can. Thank you, and sorry if I'm being annoying.

I've discovered over the years that it's far more annoying to constantly apologize than to just ask with confidence lol. :D
I hear ya though, I also sometimes struggle to find the right settings, and it can be frustrating when nobody wants to help.

Got it, now fuck off. How am I doing?
