
This will support the generate function on the LM head model, thereby also supporting the generation pipeline.

shashwat1002 changed pull request status to open
Stanford NLP org

Thanks for working on this! As far as I can tell, none of the kwargs built in prepare_inputs_for_generation actually get used by the Backpack anywhere. I believe some changes are needed for those kwargs to actually be passed by the Backpack down to the underlying Transformer.

Hi @johnhew

While that is true, the function has to be overridden for Hugging Face to consider generation supported.
So the code on this branch is simply equivalent in capability, except that it also supports the generation pipeline.
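To illustrate the point, a minimal sketch of such an override is below. The class name and method body are assumptions standing in for the real Backpack model, not the branch's actual code; the only claim is that defining prepare_inputs_for_generation is what Hugging Face's generation machinery looks for.

```python
class ToyBackpackLMHead:
    """Toy stand-in for the Backpack LM head model (hypothetical, not the real code)."""

    def prepare_inputs_for_generation(self, input_ids, **kwargs):
        # Hugging Face's generation utilities call this hook before each
        # forward pass; overriding it is what signals that the model
        # supports generation.  As noted above, kwargs such as
        # attention_mask are accepted here but not forwarded, which
        # mirrors the current state of this branch.
        return {"input_ids": input_ids}


model = ToyBackpackLMHead()
inputs = model.prepare_inputs_for_generation([1, 2, 3], attention_mask=[1, 1, 1])
print(inputs)  # {'input_ids': [1, 2, 3]} -- the attention_mask kwarg is dropped
```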

(this would also fix the demo)

It is my intention to actually pass the attention masks down to the underlying model later.
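A hedged sketch of what that later change might look like, assuming the fix amounts to forwarding the mask through prepare_inputs_for_generation (the class name and exact shape of the change are assumptions, not code from this PR):

```python
class ToyBackpackLMHeadWithMask:
    """Hypothetical follow-up version that threads the attention mask through."""

    def prepare_inputs_for_generation(self, input_ids, attention_mask=None, **kwargs):
        # Forward the attention mask alongside input_ids so the underlying
        # Transformer can actually use it during generation (assumed shape
        # of the planned fix, not the branch's current behaviour).
        inputs = {"input_ids": input_ids}
        if attention_mask is not None:
            inputs["attention_mask"] = attention_mask
        return inputs


model = ToyBackpackLMHeadWithMask()
print(model.prepare_inputs_for_generation([1, 2], attention_mask=[1, 1]))
# {'input_ids': [1, 2], 'attention_mask': [1, 1]}
```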

