RoPE scaling and Smaug BPE

by Kearm

Hi, I have been following the discussions at https://github.com/ggerganov/llama.cpp/pull/8676 and https://github.com/ggerganov/llama.cpp/issues/8650, and the Smaug BPE pre-tokenizer is NOT the same as Llama 3.1's tokenizer. Have you tested these quants at 110k context?
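
For reference, here is a rough sketch of how the recorded pre-tokenizer could be checked, assuming the `gguf` Python package (gguf-py from the llama.cpp repo) and its `GGUFReader` API; the model path is just a placeholder:

```python
# Sketch: inspect which pre-tokenizer a GGUF quant was converted with.
# Assumes the gguf-py package from the llama.cpp repository (pip install gguf).
from gguf import GGUFReader

reader = GGUFReader("model.gguf")  # placeholder path to the quant

field = reader.fields.get("tokenizer.ggml.pre")
if field is not None:
    # String-typed metadata keeps its bytes in the last `parts` entry.
    pre = bytes(field.parts[-1]).decode("utf-8")
    print(f"tokenizer.ggml.pre = {pre}")
else:
    print("no tokenizer.ggml.pre key found")
```

If I understand the linked issue correctly, a proper Llama 3.1 conversion should report `llama-bpe` here, while `smaug-bpe` would indicate the misdetection being discussed.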
