
System Template not used in example

#5
by SebastianBodza - opened

Just noticed that the system_prompt is not used in the example.

I'd also like to know how to use the system_prompt. I want to force the model to always respond in the same way, e.g. always return the top 5 keywords that describe the user's input, which is provided in the user prompt. I think the system_prompt is the right place for that, but I have no idea how to use it.
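For what it's worth, here is a minimal sketch of how a system prompt could be prepended, assuming the model expects a ChatML-style template with `<|im_start|>`/`<|im_end|>` markers (please check the model card's actual chat template; the marker strings and the example system instruction below are assumptions, not confirmed by the repo):

```python
def build_prompt(system_prompt: str, user_prompt: str) -> str:
    """Assemble a ChatML-style prompt with a fixed system instruction.

    Assumption: the chat model was trained with <|im_start|>/<|im_end|>
    role markers; adjust if the model card specifies a different template.
    """
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


# Hypothetical system instruction matching the use case above.
system = "Always return the top 5 keywords that describe the user's input."
prompt = build_prompt(system, "Berlin ist die Hauptstadt von Deutschland.")
print(prompt)
```

If the tokenizer ships a chat template, `tokenizer.apply_chat_template(...)` with a `{"role": "system", ...}` message should produce the equivalent string without hand-building it.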

Unfortunately, all the LeoLM model communities seem deserted. There are almost no answers to the questions, no matter which model.
