maxContextLength of just 64 tokens

#8
by ronaldmannak - opened

This model should have a context window of 4K tokens, but when it is loaded in the SwiftChat demo app, the max length is set to 64 tokens. Is there a reason for that? Is this related to the discrete sequence shapes mentioned in the blog post, or is it something we can easily adjust ourselves in the model files?