What is the max context length of this model?

#66
by flexwang - opened

In the config, `max_position_embeddings` is 32768; however, I read somewhere that the model was trained on an 8k context length.
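
For reference, here is a minimal sketch of how to check this value yourself with the `transformers` library; the model ID below is a placeholder, since the thread does not name the repo:

```python
from transformers import AutoConfig

# Placeholder model ID -- substitute the actual repo this discussion belongs to.
config = AutoConfig.from_pretrained("your-org/your-model")

# Maximum positional embedding size declared in the config (32768 in this case).
print(config.max_position_embeddings)
```

Note that this value only reflects what the config declares; it does not by itself tell you the context length the model was actually trained on.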

