Why UMT5?

#1
by pszemraj - opened

Why does this model use UMT5 for the model class/architecture (for a model trained primarily on English), when the model card says nothing about it?

From some test fine-tuning of this model with run_summarization.py, no gradients seem to update except for the LM head, which might be related to this. Swapping t5-v1_1 into this model's place works fine.
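To illustrate the symptom, here's a minimal sketch of the gradient check I mean (the checkpoint name is a placeholder; swap in whichever checkpoint you're testing):

```python
from transformers import AutoTokenizer, UMT5ForConditionalGeneration

model_name = "EleutherAI/pile-t5-base"  # placeholder: checkpoint under test
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = UMT5ForConditionalGeneration.from_pretrained(model_name)

batch = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                  return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**batch, labels=labels).loss
loss.backward()

# After one backward pass, every trainable parameter should have a nonzero
# gradient; parameters stuck at zero are the symptom described above.
for name, p in model.named_parameters():
    if p.grad is None or p.grad.norm().item() == 0.0:
        print("no gradient:", name)
```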

EleutherAI org

Hi, the UMT5 checkpoints were originally trained with T5x, while T5 v1.1 was trained with the original text-to-text-transfer-transformer repository. I used T5x for this, and since it's compatible, I figured it would be easier to use UMT5. Please also note this is still a WIP; an official release/blog post is coming soon.

I can also check. What script was this from?

Hey! Sorry for the delay. In the process of going through my stuff and writing this response, I realized that this model uses a verbatim T5 tokenizer, while both the smaller (base) and larger (xl) checkpoints use the LLaMA tokenizer. Is this model supposed to use that as well?
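A quick way to compare, as a sketch (the repo names are assumed from the Pile-T5 naming scheme):

```python
from transformers import AutoTokenizer

# Assumed repo ids following the Pile-T5 naming scheme.
for name in ["EleutherAI/pile-t5-base",
             "EleutherAI/pile-t5-large",
             "EleutherAI/pile-t5-xl"]:
    tok = AutoTokenizer.from_pretrained(name)
    print(name, type(tok).__name__, "vocab size:", tok.vocab_size)
```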

EleutherAI org

Thanks for letting me know. I've updated it.

lintang changed discussion status to closed

Awesome, thanks! Let me know if I should create an issue elsewhere, but either I'm doing something wrong, or the UMT5 architecture has a bug where parameters don't update for anything but the task-specific head. Have you actually fine-tuned your HF checkpoints with any of the example scripts or similar?

Running summarization with your Pile-T5 base:

[screenshot: training metrics running summarization with the UMT5 architecture]

If I update the state_dict etc. to use the standard T5 architecture / T5ForConditionalGeneration:

[screenshot: training metrics after converting to T5ForConditionalGeneration]
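(For reference, a rough sketch of that conversion, assuming the two state_dicts line up except for UMT5's per-layer relative position biases:)

```python
from transformers import (T5Config, T5ForConditionalGeneration,
                          UMT5ForConditionalGeneration)

src = UMT5ForConditionalGeneration.from_pretrained("EleutherAI/pile-t5-base")  # placeholder
dst = T5ForConditionalGeneration(T5Config.from_dict(src.config.to_dict()))

# strict=False because UMT5 stores a relative position bias in every layer,
# while T5 only keeps one in the first block; the extra keys are dropped.
# Note: the resulting T5 model shares block 0's bias across all layers, so
# this is an approximation rather than an exact port of the original weights.
missing, unexpected = dst.load_state_dict(src.state_dict(), strict=False)
print("missing:", missing)
print("unexpected:", unexpected)
```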

If you find it useful or want to explore further, the wandb project is open here.

EleutherAI org

This seems like an HF-specific bug. Very frustrating, but we did also release the T5x-compatible checkpoints, which don't have this issue (add -t5x to the end of the URL).
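(For the record, a sketch of fetching one of those, assuming the "-t5x" suffix convention gives a repo id like EleutherAI/pile-t5-base-t5x:)

```python
from huggingface_hub import snapshot_download

# Assumed repo id following the "-t5x" suffix convention described above.
path = snapshot_download("EleutherAI/pile-t5-base-t5x")
print("T5x-native checkpoint files downloaded to:", path)
```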
