Context-Window

#6
by HuggySSO - opened

Hi!

What is the context size of the 7B model, and is it the same for the 34B model?

Defog.ai org

100k context length for both, though you’ll get better performance (and latency) if you keep it under 4096!
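For anyone trying to stay under that 4096-token sweet spot, here is a minimal sketch of budget-aware prompt trimming. The `fit_prompt` helper and the whitespace-based token count are illustrative assumptions, not part of the model's API; with the real model you would count tokens with its own tokenizer (e.g. `AutoTokenizer` from `transformers`):

```python
def fit_prompt(prompt: str, count_tokens, budget: int = 4096) -> str:
    """Drop lines from the top of the prompt (e.g. least-relevant schema
    context) until the whole prompt fits within the token budget."""
    lines = prompt.splitlines()
    while lines and count_tokens("\n".join(lines)) > budget:
        lines.pop(0)  # discard the oldest / least relevant context line first
    return "\n".join(lines)

# Stand-in token counter for illustration: whitespace word count.
# With the actual model you'd do something like:
#   tok = AutoTokenizer.from_pretrained(<model id>)
#   count = lambda s: len(tok(s).input_ids)
approx_count = lambda s: len(s.split())

prompt = "\n".join(f"-- schema line {i}" for i in range(10))
trimmed = fit_prompt(prompt, approx_count, budget=12)
```

Which lines to drop first (schema tables irrelevant to the question, oldest chat turns, etc.) is a design choice; the sketch just shows the budget check itself.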

rishdotblog changed discussion status to closed

Hi @rishdotblog, what approach would you recommend if my context length exceeds 100K? Also, if I host this model, it will take longer to generate the output, right?

Can you help with this?
