
Long conversations issue

#1
by vbuhoijymzoi - opened

In long conversations (a couple of screens of text generated by llama.cpp), the model falls apart and starts generating gibberish.

Check your context length. Phi-2 tops out at 2048 tokens, like the original LLaMA-1, so once the conversation grows past that window the output degrades.
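If you are running the GGUF through llama-cpp-python, here is a minimal sketch of how to pin the context window at 2048 and check how much of it your prompt is already using. The model filename and prompt are placeholders, not from this thread:

```python
# Minimal sketch, assuming llama-cpp-python is installed and
# "phi-2.Q4_K_M.gguf" is a hypothetical local GGUF file.
from llama_cpp import Llama

llm = Llama(
    model_path="phi-2.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,                      # Phi-2's maximum context length
)

# Count how many tokens the current conversation already occupies;
# once prompt + generated tokens approach 2048, older turns need to be
# truncated or summarized, otherwise the model starts producing gibberish.
prompt = "Instruct: Summarize the plot of Hamlet.\nOutput:"
n_prompt_tokens = len(llm.tokenize(prompt.encode("utf-8")))
print(f"Prompt uses {n_prompt_tokens} of 2048 tokens")

out = llm(prompt, max_tokens=min(256, 2048 - n_prompt_tokens))
print(out["choices"][0]["text"])
```

With the plain llama.cpp CLI the equivalent is keeping the context-size flag (`-c` / `--ctx-size`) at 2048 and trimming the conversation history yourself before it overflows.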
