It seems `T5WithLMHead` is outdated

#5, opened by Narsil (HF staff)

Leaving `T5WithLMHead` in for potential older versions of the library, but adding the newer class so that `infer_framework_load_model` from the pipeline can load the model directly, without needing to refer to the pipeline task.

https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py#L1466


BTW, on the Hub side we always use the first element of `architectures`, i.e. `architectures[0]` (to populate the "Use in Transformers" modal).

So you need to swap those two entries if you want the newer class to appear as the AutoModel.
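A minimal sketch of that swap (the config dict below is illustrative, mirroring what a `config.json` for this repo might contain):

```python
# Sketch: reorder the "architectures" entries so the modern class name
# comes first, since the Hub reads architectures[0] for its UI.
import json

config = {
    "model_type": "t5",
    "architectures": ["T5WithLMHead", "T5ForConditionalGeneration"],
}

# Put T5ForConditionalGeneration first, keeping the legacy
# T5WithLMHead entry around for older library versions.
config["architectures"] = ["T5ForConditionalGeneration", "T5WithLMHead"]

print(json.dumps(config["architectures"]))
```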

I dived into the old code of Transformers, since the rename was done a looooong time ago.
At that time, `modeling_auto` did not look at the `architectures` field of the config at all, so you can simply swap the names and it will still work with older versions of Transformers.
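To illustrate why the swap is safe for old versions, here is a hedged sketch of the two resolution strategies; the function names and mapping below are illustrative, not the actual Transformers internals:

```python
# Illustrative sketch, NOT real Transformers code: the older AutoModel
# resolved the class from model_type alone, while newer pipeline loading
# can read architectures[0] from the config.
MODEL_TYPE_MAPPING = {"t5": "T5ForConditionalGeneration"}

def resolve_class_old(config: dict) -> str:
    # Old behaviour: the architectures field is ignored entirely,
    # so reordering it cannot break anything here.
    return MODEL_TYPE_MAPPING[config["model_type"]]

def resolve_class_new(config: dict) -> str:
    # Newer behaviour: prefer architectures[0] when present.
    archs = config.get("architectures") or []
    return archs[0] if archs else MODEL_TYPE_MAPPING[config["model_type"]]

config = {
    "model_type": "t5",
    "architectures": ["T5ForConditionalGeneration", "T5WithLMHead"],
}
print(resolve_class_old(config))
print(resolve_class_new(config))
```

Both paths resolve to `T5ForConditionalGeneration` once the entries are swapped, which is the point of the change.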

Cannot merge
This branch has merge conflicts in the following files:
  • config.json
