Context Window

#8
by zappa2005

This model really has a unique writing style, but as mentioned in the model card it was trained with an 8k context size. Would it be possible to train it at the 32k size mentioned in config.json?

Any reason why it was limited to 8k?

Owner

Yes, it doesn't perform well beyond 8k tokens because of the way it was trained, as you said. There are no plans to expand the context window at the moment.
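
For anyone running into this in the meantime: since config.json advertises a larger max_position_embeddings than the trained 8k window, nothing stops you from feeding in longer prompts and getting degraded output. A minimal sketch of capping inputs yourself with transformers, assuming a standard causal LM; the repo id here is a placeholder, not this model's actual id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "owner/model"  # placeholder repo id, not this model's actual id
EFFECTIVE_CONTEXT = 8192  # trained context per the model card, not config.json
MAX_NEW_TOKENS = 512

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

prompt = "..."  # your prompt here

# Truncate the prompt so prompt + generation stays inside the trained
# 8k window, regardless of the larger limit claimed in config.json.
inputs = tokenizer(
    prompt,
    return_tensors="pt",
    truncation=True,
    max_length=EFFECTIVE_CONTEXT - MAX_NEW_TOKENS,
)
outputs = model.generate(**inputs, max_new_tokens=MAX_NEW_TOKENS)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```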
