
Does the custom system prompt not work with the Llama 3 models?

#14
by ishanparihar - opened

I have been struggling with the system prompt template for Llama 3 models. I set it in LM Studio, but the generated responses are always generic.

Let me know if this is actually an issue or how I can fix this.
Thanks.

I have the exact same struggle: the usual system prompt leads to strangely generic answers. The same system prompt and the same (!) user prompt produce extremely different results in the standard Llama 3 model, even when asked very specific questions.

Cognitive Computations org

I can't speak to LM Studio, but I haven't experienced issues with the system prompt in ollama. I have heard reports of getting different results with the same prompt/system prompt with Llama-3-8b in general. I would experiment with different top_p values (~0.7) and play around with the temperature settings.
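
A minimal sketch of what that looks like against a local ollama server, passing the system prompt as a chat message and setting the sampling options explicitly (the model tag and exact values below are just placeholders, substitute whatever you have pulled locally):

```python
# Minimal sketch: chat against a local ollama server with an explicit
# system message and sampling options. Model tag and values are
# placeholders -- adjust for the model you actually use.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "dolphin-llama3:8b",  # placeholder model tag
        "messages": [
            {"role": "system", "content": "You are a terse pirate assistant."},
            {"role": "user", "content": "Introduce yourself in one sentence."},
        ],
        "options": {
            "temperature": 0.6,  # try raising/lowering this
            "top_p": 0.7,        # value suggested above
        },
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

If the system prompt is being honored, changing the system message should visibly change the tone and content of the reply even with the sampling options held fixed.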

Cognitive Computations org

(image attachment)
