bug with "forced_decoder_ids=forced_decoder_ids"

#1
by sharockys - opened

Bug info:

ValueError: A custom logits processor of type <class 'transformers.generation.logits_process.ForceTokensLogitsProcessor'> with values <transformers.generation.logits_process.ForceTokensLogitsProcessor object at 0x7f889cf07a70> has been passed to `.generate()`, but it has already been created with the values <transformers.generation.logits_process.ForceTokensLogitsProcessor object at 0x7f8896be8770>. <transformers.generation.logits_process.ForceTokensLogitsProcessor object at 0x7f8896be8770> has been created by passing the corresponding arguments to generate or by the model's config default values. If you just want to change the default values of logits processor consider passing them as arguments to `.generate()` instead of using a custom logits processor.

versions of libs:
torch==2.2.0
torchaudio==2.2.0
tokenizers==0.15.1
transformers==4.37.2
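
For context, a minimal sketch of the call pattern that reportedly triggers this error with transformers 4.37.2, assuming a Whisper-style checkpoint whose config.json already defines forced_decoder_ids (the model id, dummy input, and variable names below are illustrative, not taken from the original post):

```python
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "openai/whisper-tiny"  # placeholder checkpoint, not this repo's model
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Dummy log-mel features just to make the call runnable
# (whisper-tiny expects 80 mel bins x 3000 frames).
input_features = torch.randn(1, 80, 3000)

# The checkpoint's config already defines forced_decoder_ids, so passing them
# again here makes transformers 4.37.2 build a second ForceTokensLogitsProcessor
# and raise the ValueError shown above.
forced_decoder_ids = processor.get_decoder_prompt_ids(language="english", task="transcribe")
predicted_ids = model.generate(input_features, forced_decoder_ids=forced_decoder_ids)
```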

Efficient NLP org

Thanks for reporting this; I've fixed the problem in the code example.

The issue is that forced_decoder_ids is already included in the checkpoint's config.json, so it does not need to be specified again when calling `.generate()`. Passing this information twice was harmless in earlier versions of the transformers library, but the latest version raises an error.
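
A minimal sketch of the corrected call, under the same assumptions as the snippet above (placeholder checkpoint and dummy input): simply drop the forced_decoder_ids argument and let the checkpoint's config supply it.

```python
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "openai/whisper-tiny"  # placeholder checkpoint, not this repo's model
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

input_features = torch.randn(1, 80, 3000)  # dummy log-mel features

# No forced_decoder_ids here: config.json already carries them, and transformers
# creates the ForceTokensLogitsProcessor from that default on its own.
predicted_ids = model.generate(input_features)
print(processor.batch_decode(predicted_ids, skip_special_tokens=True))
```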

sharockys changed discussion status to closed
