passing parameters to the underlying model's forward

#3
by mahdi-b - opened

The ESM2 forward function should also be able to take a return_contacts param, but passing this param to the transformers model returns an error. In general, how can one pass parameters to a model wrapped by a transformers.models class?
Thank you,

Hi @mahdi-b ! This parameter is currently not supported in our port of ESM, but we're aiming to add it in the next couple of weeks. Once it's added you should be able to pass that argument.

Hi @mahdi-b , this feature has just been added. It is currently only available on the main branch, but will be included in the next release of transformers, which will probably be in January.

You can install from main with the command pip install --upgrade git+https://github.com/huggingface/transformers.git. After that, you can use the predict_contacts method on any ESM-2 or ESM-1b model. The method also exists on ESM-1v models, but the head has not been trained there and so the results will be incorrect. Because we just added this feature about ten minutes ago, please let me know if you encounter any issues!
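For anyone following along, here is a minimal sketch of how the new method can be called, assuming the main-branch install above; the small public checkpoint facebook/esm2_t6_8M_UR50D and the toy sequence are just placeholders, and the argument names follow the method as described in this thread.

```python
import torch
from transformers import AutoTokenizer, EsmModel

# Example ESM-2 checkpoint; any ESM-2 or ESM-1b checkpoint should behave the same way.
checkpoint = "facebook/esm2_t6_8M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = EsmModel.from_pretrained(checkpoint)

# A toy protein sequence purely for illustration.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    # predict_contacts takes the token IDs directly (the `tokens` argument),
    # plus the attention mask, and returns a square per-residue
    # contact-probability map for each sequence in the batch.
    contacts = model.predict_contacts(
        inputs["input_ids"], attention_mask=inputs["attention_mask"]
    )

print(contacts.shape)
```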

@Rocketknight1 , thanks for adding that functionality and for taking the time to message back and inform me. I was going to send a message saying it wasn't working for me (I was using return_contacts=True as an argument to the model), but then I saw your most recent push, in which you implemented it as a separate method (predict_contacts). That works better for me since I don't actually need the embeddings.
That said, it's customary to pass tokens to Hugging Face models using the input_ids parameter rather than tokens as implemented in predict_contacts; was that a deliberate decision?
Thank you!
