[Cache Request] impira/layoutlm-document-qa

by mobomas - opened

Please add the following model to the neuron cache

AWS Inferentia and Trainium org

Hi @mobomas , layoutlm models are not yet supported by optimum-neuron.
The first step would be to support exporting and running inference with this model. I opened a feature request in the optimum-neuron repo here; I could add the support when I have the bandwidth, but in the meantime feel free to pick up the task if you'd like. It should be straightforward to add following the guide I mentioned in the issue.

Then, for adding the cache, we will need extra support for caching traced models, which I am working on. I will keep you updated when we enable it.
