Extend context window

#12
by levulinh - opened

Hi, congratulations on releasing this excellent model.
Is there a way to extend the context window of this model beyond 4,096 tokens, for example via RoPE scaling (even though I know this model wasn't trained with RoPE)? It would be helpful for RAG applications. A rough sketch of what I mean is below.
Thank you.
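
For reference, the kind of override I have in mind looks roughly like the following transformers config tweak. The repo id below and whether EXAONE's custom modeling code accepts a `rope_scaling` field at all are assumptions on my part, so please treat this as a sketch rather than a supported path:

```python
# Hypothetical sketch: Llama-style linear RoPE scaling via a config override.
# Whether EXAONE's custom modeling code honors `rope_scaling` is assumed here.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct"  # assumed repo id

config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
# Linear scaling compresses positions by `factor`, roughly extending 4K -> 8K.
# This only works if the model implements RoPE and its config accepts the field.
config.rope_scaling = {"type": "linear", "factor": 2.0}
config.max_position_embeddings = 8192

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, config=config, trust_remote_code=True
)
```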

LG AI Research org

In response to feedback requesting contexts longer than 4K tokens, we are happy to inform you that we have released EXAONE 3.5, which supports long contexts of up to 32K tokens.

Thank you.
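
For anyone who lands here, a minimal loading sketch for trying the longer context (the repo id is assumed, and generation settings will vary by setup):

```python
# Minimal sketch: loading an EXAONE 3.5 checkpoint, whose config advertises
# a 32K-token context, and running a long RAG-style prompt through it.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Retrieved documents concatenated with the user question (placeholder text).
long_prompt = "...retrieved documents plus the user question..."
inputs = tokenizer(long_prompt, return_tensors="pt", truncation=True, max_length=32768)

outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```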
