facebook/bart-large-cnn does not work correctly when loaded with TFBartForConditionalGeneration

#64
by aron1016 - opened

I tried using the facebook/bart-large-cnn model by loading it with the TFBartForConditionalGeneration and TFAutoModelForSeq2SeqLM classes and passing return_tensors="tf" to the tokenizer. When running predictions, the model does not work properly and does not produce the expected results.

image.png

Please find the code below:

image.png

The model works fine when loaded with model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn") and tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn"), with return_tensors="pt" in the tokenizer call.
However, when I use return_tensors="tf", I get the error message below from:
tokenizer([input_text], return_tensors="tf")

image.png

Could you please advise how to use this model with TensorFlow?
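Since the original code is only available as screenshots, here is a minimal sketch of the TF loading path described above (an assumption of what the code looks like, not the poster's exact code). It assumes transformers and TensorFlow are installed; the model weights (~1.6 GB) are downloaded on the first call.

```python
# Sketch of loading facebook/bart-large-cnn via the TF classes with
# return_tensors="tf", the combination reported as failing in this thread.
def summarize(text: str, max_new_tokens: int = 60) -> str:
    # Imports are kept inside the function so the sketch can be defined
    # even when TensorFlow is not present.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
    model = TFAutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")

    # return_tensors="tf" is the call that raised the error described above.
    inputs = tokenizer([text], return_tensors="tf",
                       truncation=True, max_length=1024)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    article = "Long input text to summarize goes here."
    print(summarize(article))
```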

I have the same issue when using the pipeline API.

AI at Meta org

Hi @aron1016 @levani13 , it's quite hard to figure out the code from screenshots. Can you paste me some sample code or link me to a notebook that reproduces the issue?

@Rocketknight1 for me it's working again, no idea why it got messed up temporarily

AI at Meta org

Hi @levani13 , that's good to hear! We suspect the cause was an issue in loading safetensors in TF. This was fixed by the PR here, which was included in version 4.36. If you updated transformers in the meantime, that might be what happened.
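A quick way to check whether the installed transformers version is at least 4.36, the release described above as containing the fix (this helper is a sketch, not part of transformers):

```python
# Check the installed transformers version against 4.36, the release
# that shipped the safetensors-in-TF loading fix mentioned in this thread.
from importlib import metadata


def version_tuple(v: str) -> tuple:
    # "4.36.1" -> (4, 36, 1); non-numeric suffixes like "dev0" are dropped.
    parts = []
    for p in v.split("."):
        if not p.isdigit():
            break
        parts.append(int(p))
    return tuple(parts)


def has_tf_safetensors_fix(installed: str) -> bool:
    # Tuple comparison handles versions of different lengths,
    # e.g. (4, 36, 1) >= (4, 36).
    return version_tuple(installed) >= (4, 36)


try:
    installed = metadata.version("transformers")
    print(installed, has_tf_safetensors_fix(installed))
except metadata.PackageNotFoundError:
    print("transformers is not installed")
```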

@aron1016 , let me know if upgrading transformers fixes your issues as well!

Yeah, I did update it as part of troubleshooting:

!pip install transformers==4.36.1

The above code works fine after upgrading the transformers library.

Thank you!
