
Does this model, LLongMA-2-7B, have 8K context?

#2
by mirek190 - opened

Do I understand that correctly?
If so, then under llama.cpp I just need --ctx-size 8186?

I'd say use
-c 8192 --rope-freq-base 10000 --rope-freq-scale 0.5
as TheBloke wrote.
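For reference, a full llama.cpp call might look something like this (a minimal sketch; the model filename below is just a placeholder for whatever quantized file you downloaded):

./main -m llongma-2-7b.q4_K_M.bin -c 8192 --rope-freq-base 10000 --rope-freq-scale 0.5 -n 256 -p "Once upon a time"

The --rope-freq-scale 0.5 part is the linear RoPE scaling factor: LLaMA-2's native 4096-token context stretched by 2x gives the 8192 you pass with -c (the long form is --ctx-size).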

Ohh, I see now ... thanks
