Increase the context length for this model TheBloke/Mistral-7B-Instruct-v0.1-GGUF?

#14 opened by Rishu9401

How do I increase the context length for this model TheBloke/Mistral-7B-Instruct-v0.1-GGUF? I keep getting this warning:
WARNING:ctransformers:Number of tokens (7779) exceeded maximum context length (512).

You can set the desired context length in the config dictionary. Make sure it is large enough to hold your prompt; for a 7779-token input you need at least 8192, which matches the 8k window Mistral-7B-v0.1 was trained with.
Example:

from langchain_community.llms import CTransformers

# Raise context_length so long prompts are not truncated at the default 512 tokens
config = {'max_new_tokens': 256, 'temperature': 0.8, 'context_length': 8192}
llm = CTransformers(model="TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
                    model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
                    config=config)
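The warning in the question comes from the ctransformers backend itself, so if you load the model with ctransformers directly rather than through the LangChain wrapper, the same setting can be passed as a keyword argument. A minimal sketch, assuming the ctransformers package is installed (the prompt string is only a placeholder):

from ctransformers import AutoModelForCausalLM

# Load the GGUF file with an 8192-token context window so long prompts fit;
# config values such as context_length and max_new_tokens can be passed as kwargs.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
    model_type="mistral",
    context_length=8192,
    max_new_tokens=256,
    temperature=0.8,
)

print(llm("[INST] Your long prompt goes here [/INST]"))

Keep in mind that a larger context window increases memory usage, so raise it only as far as your prompts actually need.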

Thank you. I will try it this way.
