
This PR fixes the tokenizer of OPT-30b so that it has a padding token.
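For context, the fix amounts to declaring a `pad_token` in the tokenizer's configuration so batched encoding can pad shorter sequences. A minimal sketch of the idea, assuming OPT's convention of `<pad>` as the padding token (the config dict here is illustrative, not the actual file contents):

```python
import json

# Hypothetical minimal tokenizer config before the fix: special tokens
# are declared, but no padding token, so batched padding fails.
config = {"bos_token": "</s>", "eos_token": "</s>"}

# The fix: declare a padding token (OPT uses "<pad>").
config["pad_token"] = "<pad>"

print(json.dumps(config, indent=2, sort_keys=True))
```

With the pad token present, calls like `tokenizer(texts, padding=True)` can produce rectangular batches instead of raising an error about a missing padding token.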

patrickvonplaten changed pull request status to merged

Maybe we could store the tokenizer.json as prettified JSON as well; that would make it easier to see the modified lines in PRs.
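The benefit is easy to see with plain `json.dumps`: a compact dump puts the entire file on one line, so any edit shows up as a single giant diff, while an indented dump puts one key per line so PR diffs isolate exactly what changed. A small illustration (the sample data is made up):

```python
import json

# Made-up fragment resembling a tokenizer.json structure.
data = {"version": "1.0", "added_tokens": [{"id": 1, "content": "<pad>"}]}

# Compact dump: everything on one line; diffs are unreadable.
compact = json.dumps(data, separators=(",", ":"))

# Prettified dump: one key per line; diffs show only modified lines.
pretty = json.dumps(data, indent=2)

print(len(compact.splitlines()))  # 1
print(len(pretty.splitlines()))   # 9
```

If I recall correctly, the `tokenizers` library's `Tokenizer.save` accepts a `pretty` flag for exactly this purpose, so the change may be as small as passing `pretty=True` when serializing.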

Exactly what @sgugger said today as well. Agree 100%

We should open a PR on Transformers across all processor classes, I think. I can do it on Monday if no one beats me to it.
