Upload tokenizer

by ArthurZ (HF staff)

Update the tokenizer to ensure that its length matches config.vocab_size, following the issue reported on GitHub.
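The check the PR enforces can be sketched with toy stand-ins (the class and all numbers below are illustrative, not the real transformers objects): the tokenizer's total length, base vocabulary plus added special tokens, must equal the vocab_size declared in the model config.

```python
# Minimal sketch of the length == config.vocab_size consistency check.
# ToyTokenizer and all numbers are hypothetical, for illustration only.

class ToyTokenizer:
    def __init__(self, vocab, added_tokens):
        self.vocab = dict(vocab)                # base vocabulary
        self.added_tokens = list(added_tokens)  # extra special tokens

    def __len__(self):
        # Length counts the base vocab plus any added special tokens.
        return len(self.vocab) + len(self.added_tokens)

# Illustrative: a 32000-entry base vocab plus 4 added special tokens.
tokenizer = ToyTokenizer(
    vocab={f"tok{i}": i for i in range(32000)},
    added_tokens=["<PRE>", "<MID>", "<SUF>", "<EOT>"],
)

config_vocab_size = 32004  # hypothetical value read from config.json
assert len(tokenizer) == config_vocab_size
```

If the added tokens are missing from the serialized tokenizer, the two numbers diverge and downstream code that sizes embeddings from config.vocab_size breaks.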


The normalized / not-normalized information is not especially important for previous transformers versions, so this fixes them: before, the fast tokenizer had normalized=False while the slow tokenizer's added tokens had normalized=True.
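As an illustration of the mismatch (the entry below is hypothetical, not the actual file contents), an added-token record in the slow tokenizer's serialized config would need its normalized flag flipped to match the fast tokenizer:

```json
{
  "added_tokens_decoder": {
    "32000": {
      "content": "<PRE>",
      "normalized": false,
      "special": true
    }
  }
}
```

With normalized set to false on both sides, the slow and fast tokenizers treat the added special tokens identically.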

Ready to merge
This branch is ready to be merged automatically.
