Non-default generation parameters issue

#39
by mikhail-panzo

I'm running into an issue with generation parameters. Here is the full warning message:

Some non-default generation parameters are set in the model config. These should go into a GenerationConfig file (https://huggingface.co/docs/transformers/generation_strategies#save-a-custom-decoding-strategy-with-your-model) instead. This warning will be raised to an exception in v4.41.

Non-default generation parameters: {'max_length': 1876}
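The linked docs describe saving custom decoding parameters as a separate GenerationConfig file rather than in the model config. A minimal sketch of that, assuming the save path is a placeholder and `max_length` is the value from the warning:

```python
from transformers import GenerationConfig

# max_length=1876 is the non-default value reported by the warning;
# "path/to/checkpoint" is a placeholder for your checkpoint directory.
generation_config = GenerationConfig(max_length=1876)
generation_config.save_pretrained("path/to/checkpoint")  # writes generation_config.json
```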

Your generation config was originally created from the model config, but the model config has changed since then. Unless you pass the generation_config argument to this model's generate calls, they will revert to the legacy behavior where the base generate parameterization is loaded from the model config instead. To avoid this behavior and this warning, we recommend you to overwrite the generation config model attribute before calling the model's save_pretrained, preferably also removing any generation kwargs from the model config. This warning will be raised to an exception in v4.41.
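Following the warning's own suggestion, one way to silence it is to rebuild the generation config from the model config and strip the generation kwarg from the model config before saving. A minimal sketch, assuming `AutoModel` loads your checkpoint and the model name and save path are placeholders:

```python
from transformers import AutoModel, GenerationConfig

# Placeholder checkpoint name: the model that raised the warning.
model = AutoModel.from_pretrained("your-username/your-model")

# Move the generation parameterization into the generation config,
# as the warning recommends.
model.generation_config = GenerationConfig.from_model_config(model.config)

# Reset the generation kwarg in the model config so it is no longer
# "non-default" (20 is the transformers default for max_length).
model.config.max_length = 20

model.save_pretrained("path/to/checkpoint")  # placeholder path
```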

Same issue here. My training loss is 0.0 and my validation loss is nan.
