UserWarning: Using the model-agnostic default `max_length` (=20)

#5
by tombenj - opened

I'm getting this warning even though I've added the following parameters to the training parameters:
"max_seq_length": 512,
"max_target_length": 256,
"max_length": 1024,
"max_new_tokens": 100,
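For what it's worth, those keys may never reach generation at all: `generate()` only honors lengths supplied through a generation config or the trainer's generation arguments, not training-time settings like `max_seq_length`. A hedged sketch of two ways to route them (parameter names are from `transformers`' `Seq2SeqTrainingArguments` and `PreTrainedModel.generation_config`; the exact wiring depends on your training script):

```python
from transformers import AutoModelForSeq2SeqLM, Seq2SeqTrainingArguments

model = AutoModelForSeq2SeqLM.from_pretrained("google-t5/t5-base")

# Option 1: set generation defaults on the model itself, so every
# generate() call (including eval-time generation) picks them up.
model.generation_config.max_new_tokens = 100

# Option 2: tell the Seq2SeqTrainer how long eval generations may be.
args = Seq2SeqTrainingArguments(
    output_dir="out",
    predict_with_generate=True,  # use generate() during evaluation
    generation_max_length=256,   # replaces the model-agnostic default of 20
)
```

Either approach should silence the warning, since generation then has an explicit length instead of falling back to the default `max_length=20`.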

The warning:
67%|██████▋ | 18000/27000 [1:28:30<36:56, 4.06it/s]/app/env/lib/python3.10/site-packages/transformers/generation/utils.py:1178: UserWarning: Using the model-agnostic default max_length (=20) to control the generation length. We recommend setting max_new_tokens to control the ma

I'm using seq2seq with google-t5/t5-base. I'd love any suggestions on how to force the generation length.
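To illustrate why the warning fires, here is a small sketch of the length-resolution precedence as I read it from the warning text (my own reconstruction, not the library's actual code): an explicit `max_new_tokens` wins, then `max_length`, and only if neither reaches `generate()` does the model-agnostic default of 20 kick in.

```python
def resolve_generation_length(input_len, max_new_tokens=None, max_length=None):
    """Rough precedence sketch: max_new_tokens, then max_length,
    then the model-agnostic default of 20 (which triggers the warning)."""
    if max_new_tokens is not None:
        return input_len + max_new_tokens
    if max_length is not None:
        return max_length
    return 20  # the default `max_length` named in the warning

# Training-time params like max_seq_length never reach generate(),
# so the default applies:
print(resolve_generation_length(input_len=8))                      # 20
print(resolve_generation_length(input_len=8, max_new_tokens=100))  # 108
```

So the fix is to make sure one of these values is actually passed to generation, not just to the training parameters.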
