Sending multiple question prompts

#1
by 123www - opened

How can I send multiple question prompts at once?

Hi,

This model is fine-tuned on DocVQA, a dataset of document-question pairs, so it's not really possible to ask follow-up questions in a conversational setup.

However, if you mean sending a batch of images + questions through the model, that's possible. You can provide a batch of pixel_values + decoder_input_ids to the generate method, and use the batch_decode method of the tokenizer to turn the generated IDs into text.

123www changed discussion status to closed

Could you provide an example of this?

> However, if you mean sending a batch of images + questions through the model, that's possible. You can provide a batch of pixel_values + decoder_input_ids to the generate method, and use the batch_decode method of the tokenizer to turn the generated IDs into text.

Does anyone have an example of how to do this? Thanks.
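
Here is a minimal sketch of the batched approach described above. It assumes the Donut DocVQA checkpoint (naver-clova-ix/donut-base-finetuned-docvqa) and hypothetical image paths and questions; swap in the checkpoint, task prompt, and files for the model you actually use.

```python
import re
import torch
from PIL import Image
from transformers import DonutProcessor, VisionEncoderDecoderModel

# Assumed checkpoint: a Donut model fine-tuned on DocVQA
checkpoint = "naver-clova-ix/donut-base-finetuned-docvqa"
processor = DonutProcessor.from_pretrained(checkpoint)
model = VisionEncoderDecoderModel.from_pretrained(checkpoint)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
model.eval()

# Hypothetical document images, one question per image
images = [Image.open("doc_1.png").convert("RGB"),
          Image.open("doc_2.png").convert("RGB")]
questions = ["What is the invoice number?",
             "What is the total amount?"]

# Batch of pixel_values
pixel_values = processor(images, return_tensors="pt").pixel_values.to(device)

# Batch of decoder_input_ids built from the DocVQA task prompt
prompts = [f"<s_docvqa><s_question>{q}</s_question><s_answer>" for q in questions]
decoder_input_ids = processor.tokenizer(
    prompts, add_special_tokens=False, padding=True, return_tensors="pt"
).input_ids.to(device)

# Generate answers for the whole batch in one call
outputs = model.generate(
    pixel_values,
    decoder_input_ids=decoder_input_ids,
    max_length=model.decoder.config.max_position_embeddings,
    pad_token_id=processor.tokenizer.pad_token_id,
    eos_token_id=processor.tokenizer.eos_token_id,
    use_cache=True,
    bad_words_ids=[[processor.tokenizer.unk_token_id]],
    return_dict_in_generate=True,
)

# Turn the generated IDs back into text and parse out the answers
for question, sequence in zip(questions, processor.batch_decode(outputs.sequences)):
    sequence = sequence.replace(processor.tokenizer.eos_token, "")
    sequence = sequence.replace(processor.tokenizer.pad_token, "")
    sequence = re.sub(r"<.*?>", "", sequence, count=1).strip()  # drop the first task token
    print(question, "->", processor.token2json(sequence))
```

One caveat: padding=True right-pads the shorter prompts, and those padded positions are fed to the decoder before generation continues. If your questions differ a lot in length, you may get cleaner results by grouping questions of similar length or running them one at a time.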
