Fix wrong model_max_length

#4
by andstor - opened

The model has a context window of 2048 tokens (n_positions), but the tokenizer's model_max_length does not match. The tokenizer config should be updated so it supports the same length.
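A minimal sketch of what this fix amounts to: updating the `model_max_length` entry in `tokenizer_config.json` to match the model's `n_positions`. The file path and the stale placeholder value below are illustrative assumptions, not taken from this repo.

```python
import json
import os
import tempfile

# Hypothetical setup: write a tokenizer_config.json whose
# model_max_length is the "effectively unlimited" sentinel value
# that transformers uses when no limit was set.
path = os.path.join(tempfile.mkdtemp(), "tokenizer_config.json")
with open(path, "w") as f:
    json.dump({"model_max_length": 1000000000000000019884624838656}, f)

# The fix: align model_max_length with the model's context window.
with open(path) as f:
    cfg = json.load(f)
cfg["model_max_length"] = 2048  # match n_positions
with open(path, "w") as f:
    json.dump(cfg, f)

# Verify the updated value.
with open(path) as f:
    print(json.load(f)["model_max_length"])
```

After this change, loading the tokenizer with `AutoTokenizer.from_pretrained` picks up the corrected limit, so inputs longer than 2048 tokens are flagged or truncated instead of being passed through silently.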

