max_position_embeddings

#2
by ehartford - opened

I saw that max_position_embeddings is 4096.

That means it has a 4K context, right?

I would be interested in this model if it had a 16K or 32K context, but 4K is too small.

DeepSeek org

Yes, the context length we trained with is 4K. The next version of our model might support a longer context.
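
For anyone who wants to verify this themselves, a minimal sketch using the transformers AutoConfig API; the repo id below is a placeholder, not the actual model path:

```python
# Minimal sketch, assuming the standard Hugging Face transformers API.
# The repo id is a placeholder; substitute the real model path.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("deepseek-ai/<model-name>")

# max_position_embeddings is the size of the position-embedding table,
# which typically caps the usable context window.
print(config.max_position_embeddings)  # 4096 -> 4K context
```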

luofuli changed discussion status to closed
