
MultiUniFormerWithProjectionHead.forward() got an unexpected keyword argument 'output_attentions'

#1
by pamessina - opened

Hi. First of all congratulations for the first place in the challenge and thank you for sharing the model.

I'm trying to run the model in inference mode on Colab, as suggested in the demo, but I'm getting the error in the title.

See the screenshot below:

[Screenshot: image.png, showing the error traceback]

Do you have any idea what might be causing the error?

Thanks in advance.

Australian e-Health Research Centre org
edited Aug 26

Hi @pamessina ,

I just ran the notebook again and there were no issues (the updated notebook can be seen here: https://huggingface.co/aehrc/cxrmate-rrg24/blob/main/demo.ipynb). Could it be an issue with your version of the transformers package? Which version are you using?
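In case it helps, here is a quick, generic way to check which Transformers version is installed (not specific to this model; it uses the standard library so it works even if the package is missing):

```python
from importlib import metadata

# Report the installed Transformers version, if any.
try:
    print(metadata.version("transformers"))
except metadata.PackageNotFoundError:
    print("transformers is not installed")
```

On Colab the same information is available via `!pip show transformers`.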

Thanks,
A

Hi @anicolson ,

Thanks for the prompt response. I'm running the notebook on Google Colab; you can find the copy I'm actually running here: https://colab.research.google.com/drive/1GFLmrHX-2QfwX5CAns7ROgAtnRz-k1Qh?usp=sharing. Note that I had to modify the notebook slightly to get it to run: the original doesn't include pip install commands for the datasets and timm packages, which have to be installed manually on Colab. Would you mind sharing a version of the notebook that runs successfully on Google Colab?
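For reference, the extra setup cell I added at the top of my Colab copy looks like this (just the two packages mentioned above; nothing else was changed):

```shell
# Packages the original notebook assumes but Colab doesn't preinstall:
pip install datasets timm
```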

Thanks in advance,
Pablo

Australian e-Health Research Centre org
edited Aug 27

Hi @pamessina ,

Can you please try it again (a new notebook has been uploaded)? The correct Transformers version needs to be used for it to work, and it is now pinned in the notebook (see https://github.com/huggingface/transformers/issues/31171; this issue should be alleviated in later releases of Transformers). The output_attentions issue, another one that appeared with later versions of Transformers, has also been dealt with.
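For anyone hitting the same error on a newer Transformers release, here is a minimal, hypothetical sketch (not the actual CXRMate code) of the mechanism: newer versions of the generation code forward extra keyword arguments such as `output_attentions` to the model's `forward()`, so a custom `forward()` signature that doesn't declare them raises a `TypeError`.

```python
# Hypothetical stand-ins for a custom model's forward() method.

def strict_forward(pixel_values):
    # Does not accept generation kwargs -> fails on newer Transformers.
    return pixel_values

def tolerant_forward(pixel_values, output_attentions=None, **kwargs):
    # Accepts and ignores unused generation kwargs -> keeps working.
    return pixel_values

try:
    strict_forward([1], output_attentions=False)
except TypeError as exc:
    print("strict:", exc)  # unexpected keyword argument 'output_attentions'

print("tolerant:", tolerant_forward([1], output_attentions=False))
```

Adding `**kwargs` (or the explicit keyword arguments) to the custom `forward()` signature is the kind of change that resolves this class of error.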

Thanks,
A

It's working now! Thank you so much.
