Warning during inference

#1
by vince62s - opened

Hello,
Why are we getting these warnings:
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's attention_mask to obtain reliable results.
Setting pad_token_id to eos_token_id:32005 for open-end generation.

They don't prevent anything from working fine, just asking.
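For reference, these warnings typically come from `model.generate()` when no `attention_mask` is passed and no `pad_token_id` is configured, so the library falls back to the EOS token and warns about it. A minimal sketch of how to silence them, assuming a Hugging Face `transformers` causal LM (the model name here is a placeholder, not the model from this thread):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; substitute your own checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Many causal LMs ship without a pad token; reusing EOS explicitly
# avoids the "Setting pad_token_id to eos_token_id" warning.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# The tokenizer returns both input_ids and attention_mask; passing the
# mask explicitly avoids the "attention mask ... not set" warning.
inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(
    **inputs,  # includes input_ids and attention_mask
    pad_token_id=tokenizer.eos_token_id,
    max_new_tokens=20,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The "Special tokens have been added" warning is separate: it only notes that the tokenizer's vocabulary contains added special tokens, and it is harmless as long as the corresponding embeddings were trained.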
