Update config.json to include p5 and a few other layers

#1
by ankrgyl - opened

This helps the tests in PR #18414 work.
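
For context, the tiny model's config.json carries its detectron2 settings under `detectron2_config_args`, and the p2/p3/p4/p5 feature levels are listed there. Here's a rough sketch of how to inspect those lists after the update — not the exact diff, and the specific keys are my assumption about what this change touches:

```python
# Illustrative sketch (not the exact diff): inspect the detectron2 settings
# that config.json stores for this tiny model. The keys shown are standard
# detectron2 FPN options and an assumption about what this PR updates.
from transformers import LayoutLMv2Config

config = LayoutLMv2Config.from_pretrained("hf-internal-testing/tiny-random-layoutlmv2")

# Feature-pyramid levels read by the detection heads; "p5" should appear
# here once the update is in place.
print(config.detectron2_config_args.get("MODEL.RPN.IN_FEATURES"))
print(config.detectron2_config_args.get("MODEL.ROI_HEADS.IN_FEATURES"))
```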

@nielsr could you check here?

Hi everyone, just wanted to bump this.

Hugging Face Internal Testing Organization org

Can we confirm hf-internal-testing/tiny-random-layoutlmv2 isn't used yet in HuggingFace Transformers? cc @Narsil

If helpful, the only reference to it in the repo is here:

```
$ git grep tiny-random-layoutlmv2
tests/deepspeed/test_model_zoo.py:LAYOUTLMV2_TINY = "hf-internal-testing/tiny-random-layoutlmv2"
```

I took a look and LAYOUTLMV2_TINY is defined but never used.

Hugging Face Internal Testing Organization org
edited Sep 2, 2022

Ok, so I discussed this offline with @narsil; it's unclear to both of us why this "tiny" LayoutLMv2 model needs to be made a bit bigger. Wouldn't it already work with the model as it is?
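
For anyone who wants to try, a minimal smoke test along these lines should answer that. This is an assumed reproduction, not the exact failing test from PR #18414, and it requires detectron2 to be installed; the input shapes are guesses at minimal valid inputs:

```python
# Minimal smoke test (assumed repro, not the test from PR #18414):
# run a forward pass with the tiny checkpoint as-is.
import torch
from transformers import LayoutLMv2Model

model = LayoutLMv2Model.from_pretrained("hf-internal-testing/tiny-random-layoutlmv2")
model.eval()

input_ids = torch.tensor([[1, 2]])             # two dummy tokens
bbox = torch.zeros(1, 2, 4, dtype=torch.long)  # one box per token, 0-1000 coordinate range
image = torch.zeros(1, 3, 224, 224)            # dummy document image

with torch.no_grad():
    outputs = model(input_ids=input_ids, bbox=bbox, image=image)
print(outputs.last_hidden_state.shape)
```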

Hugging Face Internal Testing Organization org

I am looking into it (sorry for the delay).

Hugging Face Internal Testing Organization org

Ok, this modification does make things work. Without it, the failure happens inside detectron2 itself, which is why I'm not willing to invest time in understanding what's going on there.

More importantly, this configuration change is compatible with the small weights that are already here (I thought I had to modify the config and produce smaller weights, but maybe that was not necessary).
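
As a sanity check of that compatibility claim, one can reload the existing weights under the updated config and inspect the loading report. This is a sketch of that idea, not something from the PR itself; `output_loading_info` is a standard `from_pretrained` option, and a genuine shape mismatch would raise during loading:

```python
# Sketch of a compatibility check (not part of this PR): load the existing
# small weights under the updated config and inspect the loading report.
from transformers import LayoutLMv2Model

model, loading_info = LayoutLMv2Model.from_pretrained(
    "hf-internal-testing/tiny-random-layoutlmv2",
    output_loading_info=True,
)
# Empty lists mean the checkpoint keys line up with the model; a tensor
# shape mismatch would have raised an error during loading.
print(loading_info["missing_keys"])
print(loading_info["unexpected_keys"])
```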

Narsil changed pull request status to merged
