Update tokenizer_config.json

#2
by lysandre - opened

The model max length is currently set directly within the transformers library, as can be seen here: https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py#L90-L91

We're removing these definitions as they are legacy, so we're updating the remote checkpoints to ensure this information is not lost. Thank you!
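For context, the change amounts to adding a `model_max_length` entry to the checkpoint's `tokenizer_config.json`, so tokenizers loaded from the Hub still know the sequence limit once the hard-coded table is gone. A minimal sketch of how one could verify this after the update, assuming a BERT-style checkpoint with the usual 512-token limit (the checkpoint name below is illustrative):

```python
# Sketch: check that the max length is read from the checkpoint's
# tokenizer_config.json rather than a hard-coded mapping in transformers.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
print(tokenizer.model_max_length)  # expected: 512 once the config carries it
```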

sajvir changed pull request status to merged