Fixed: TypeError: BARTDecoder.prepare_inputs_for_inference() got an unexpected keyword argument 'past_key_values'

#4

Description

I encountered the following error when running locally.
I just realized that transformers introduced a breaking change in version 4.25.0 (Dec 2, 2022) (https://pypi.org/project/transformers/#history).
To fix this problem, simply pin transformers==4.24.0.
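Since the original PR title was "Update requirements.txt", the pin presumably lands there; either of the following (equivalent) forms works:

```
# requirements.txt
transformers==4.24.0

# or install directly
pip install transformers==4.24.0
```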

Traceback (most recent call last):
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/gradio/routes.py", line 401, in run_predict
    output = await app.get_blocks().process_api(
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/gradio/blocks.py", line 1302, in process_api
    result = await self.call_function(
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/gradio/blocks.py", line 1025, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/app.py", line 17, in demo_process
    output = pretrained_model.inference(image=input_img, prompt=task_prompt)["predictions"][0]
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/donut/model.py", line 464, in inference
    decoder_output = self.decoder.model.generate(
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/transformers/generation/utils.py", line 1437, in generate
    return self.greedy_search(
  File "/media/data/thinhhnt/HuggingFace/Spaces/donut-base-finetuned-cord-v2/env/lib/python3.10/site-packages/transformers/generation/utils.py", line 2245, in greedy_search
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
TypeError: BARTDecoder.prepare_inputs_for_inference() got an unexpected keyword argument 'past_key_values'
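A minimal, self-contained sketch of the failure mode (class names here are hypothetical, not donut's actual code): transformers >= 4.25.0 began passing `past_key_values` into `prepare_inputs_for_generation`, while Donut's `BARTDecoder.prepare_inputs_for_inference` was written against the older `past` keyword, so the call raises `TypeError`. Besides pinning transformers==4.24.0, accepting both keyword spellings would also be forward-compatible:

```python
class LegacyDecoder:
    """Mimics the pre-4.25-style signature: only accepts `past`."""
    def prepare_inputs_for_inference(self, input_ids, past=None):
        return {"input_ids": input_ids, "past_key_values": past}


class PatchedDecoder(LegacyDecoder):
    """One possible fix: accept both the old and the new keyword."""
    def prepare_inputs_for_inference(self, input_ids, past=None, past_key_values=None):
        if past_key_values is None:
            past_key_values = past
        return {"input_ids": input_ids, "past_key_values": past_key_values}


# The old signature rejects the keyword that newer generate() passes:
try:
    LegacyDecoder().prepare_inputs_for_inference([0], past_key_values=None)
except TypeError as err:
    print(type(err).__name__)  # TypeError

# The patched signature handles both spellings:
print(PatchedDecoder().prepare_inputs_for_inference([0], past_key_values="cache"))
```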
thinh-researcher changed pull request title from Update requirements.txt to Fixed: TypeError: BARTDecoder.prepare_inputs_for_inference() got an unexpected keyword argument 'past_key_values'
NAVER CLOVA INFORMATION EXTRACTION org

Great! Thank you for the PR :) @thinh-researcher

gwkrsrch changed pull request status to merged
