Set max_length to 50 to avoid deprecation message

#2
by orena - opened

When running without `max_length` we get this warning:

Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
/home/oamsalem/.local/lib/python3.9/site-packages/transformers/generation_utils.py:1359: UserWarning: Neither `max_length` nor `max_new_tokens` has been set, `max_length` will default to 50 (`self.config.max_length`). Controlling `max_length` via the config is deprecated and `max_length` will be removed from the config in v5 of Transformers -- we recommend using `max_new_tokens` to control the maximum length of the generation.
  warnings.warn(

Better to set it explicitly to avoid this message.
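
A minimal sketch of what the warning recommends (model name and prompt here are illustrative, not from this repo): pass `max_new_tokens` to `generate()` instead of relying on the config's `max_length`, and set `pad_token_id` explicitly to silence the open-end generation message as well.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Hello, world", return_tensors="pt")

# Passing max_new_tokens explicitly avoids the deprecation warning about
# max_length defaulting from the model config; setting pad_token_id avoids
# the "Setting pad_token_id to eos_token_id" message.
output_ids = model.generate(
    **inputs,
    max_new_tokens=50,
    pad_token_id=tokenizer.eos_token_id,
)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

Note that `max_new_tokens` counts only the newly generated tokens, while the old `max_length` counted the prompt tokens too, so the two are not interchangeable values.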
