Wrong "model_max_length" set in tokenizer?

#3
by ntkuhn - opened

It looks like the tokenizer configuration sets "model_max_length": 1024, while the model can take inputs up to 2048 tokens. Is this an oversight?

Salesforce org

That is correct. Thanks for pointing this out! Fixed.
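For anyone following along, the fix presumably amounts to a one-line change in the repo's tokenizer_config.json (a sketch of the corrected setting, assuming a 2048-token context window; the file's other fields are omitted here):

```json
{
  "model_max_length": 2048
}
```

Users who already cached the old config can also override the value at load time by passing model_max_length as a keyword argument to the tokenizer's from_pretrained call.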

rooa changed discussion status to closed
