Add `eos_token` to the tokenizer config.

#17
by Wauplin - opened

This PR fixes the widget for this model by setting an explicit eos_token/bos_token in the tokenizer config. It should not affect existing users loading the model or tokenizer through the libraries.

PR changes have been generated using:

```python
from transformers import AutoTokenizer

# Re-export the tokenizer so its config files include the explicit special tokens
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-large")
tokenizer.save_pretrained('path/to/local/dir')
```
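In effect, the regenerated files add explicit special-token entries to `tokenizer_config.json`. A minimal sketch of that change, assuming the GPT-2 token values DialoGPT inherits (the `"<|endoftext|>"` values and the pre-existing field shown are illustrative, not taken from this PR's diff):

```python
import json

# Illustrative pre-existing config field, not the actual file contents
config = {"model_max_length": 1024}

# DialoGPT uses the GPT-2 tokenizer, whose bos/eos/unk tokens are all
# "<|endoftext|>" (assumption based on GPT-2's defaults)
config["eos_token"] = "<|endoftext|>"
config["bos_token"] = "<|endoftext|>"
config["unk_token"] = "<|endoftext|>"

# Write the updated config, as save_pretrained would
with open("tokenizer_config.json", "w") as f:
    json.dump(config, f, indent=2)
```

With the explicit entries in place, the hosted widget can locate the end-of-sequence token without falling back to library-side defaults.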

(related to PR microsoft/DialoGPT-medium/discussions/17)

Microsoft org

Thanks @Wauplin !

lysandre changed pull request status to merged
