Context Length?

#4
by lazyDataScientist - opened

I am guessing the context length is 4k tokens, but llama.cpp is suggesting 2k tokens. Just wanted to confirm what it actually is.

Yes, it should be native 4k! The 2k figure is just llama.cpp's default context size, not a property of the model.
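For anyone hitting this: llama.cpp falls back to a 2048-token context unless you override it, so you have to pass the context size explicitly to get the model's full window. A minimal sketch (model path is a placeholder; older llama.cpp builds name the binary `./main` instead of `llama-cli`):

```shell
# Request the model's native 4k context with --ctx-size (short form: -c).
# Without this flag, llama.cpp uses its own default (historically 2048).
./llama-cli -m ./model.gguf --ctx-size 4096 -p "Hello"
```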
