Set tokenizer model_max_length to 2048

#10
by joaogante - opened

As described in the FLAN-UL2 blog post, the receptive field of the model was increased from 512 to 2048, so the tokenizer's model_max_length should be updated to match.
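
A minimal sketch of the user-side workaround, assuming the google/flan-ul2 checkpoint: until the tokenizer config carries the new value, model_max_length can be passed explicitly when loading.

```python
from transformers import AutoTokenizer

# Passing model_max_length explicitly overrides whatever value is stored in
# tokenizer_config.json (2048 matches the receptive field from the blog post).
tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2", model_max_length=2048)

print(tokenizer.model_max_length)  # -> 2048
```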

There is also an n_positions field in the model config, set to 512, but I can't see where it is used in transformers πŸ€”
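
A quick way to inspect the two values side by side, again assuming the google/flan-ul2 checkpoint (n_positions is read defensively, since it does not appear to be consumed by the modeling code, which relies on relative position biases):

```python
from transformers import AutoConfig, AutoTokenizer

config = AutoConfig.from_pretrained("google/flan-ul2")
tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")

# n_positions is stored in config.json but, as noted above, does not seem to be
# read anywhere in the T5/UL2 code path.
print("config n_positions:", getattr(config, "n_positions", None))
print("tokenizer model_max_length:", tokenizer.model_max_length)
```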

Thanks for fixing!

ybelkada changed pull request status to merged
