context length

#2
by ehartford - opened

What is the context length of Qwen1.5-32B?

Qwen org

Yes, it is 32k. All models of Qwen1.5 support 32k tokens.
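
In case it helps anyone reading later: here is a minimal sketch for checking this yourself with the Hugging Face `transformers` library (version 4.37 or newer, which added Qwen2 architecture support). The configured context window is stored in the `max_position_embeddings` field of the model config:

```python
from transformers import AutoConfig

# Download only the model config (no weights) for the 32B checkpoint.
config = AutoConfig.from_pretrained("Qwen/Qwen1.5-32B")

# Qwen1.5 checkpoints record their supported context window here;
# this should print 32768, i.e. 32k tokens.
print(config.max_position_embeddings)
```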

I would like to ask: when training Qwen1.5, was the sequence_length the same as Qwen's, i.e. 2k, or a different value? Did the separately released base models go through incremental long-context training? If so, what sequence_length was used for the long-context training stage? Thanks!
