# mt5-small-lm-adapt
---
license: apache-2.0
---

🤗 Language model initialized from mT5 and trained for an additional 100K steps on the prefix-LM objective using mC4 data.
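In prefix-LM training, each sequence is split into a prefix that the model reads as context and a continuation that it must predict. A minimal sketch of that split (the helper name is illustrative, not from the T5X codebase):

```python
def prefix_lm_split(tokens, prefix_len):
    """Split a token sequence for prefix-LM training:
    the prefix is consumed as context, and the remaining
    tokens become the prediction targets.
    (Illustrative helper, not from the actual training code.)"""
    return tokens[:prefix_len], tokens[prefix_len:]

prefix, targets = prefix_lm_split(["The", "cat", "sat", "on", "the", "mat"], 3)
# prefix  -> ["The", "cat", "sat"]
# targets -> ["on", "the", "mat"]
```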

Paper: Overcoming Catastrophic Forgetting in Zero-Shot Cross-Lingual Generation

Authors: Tu Vu, Aditya Barua, Brian Lester, Daniel Cer, Mohit Iyyer, Noah Constant

PyTorch port of the original Flax checkpoint from Google's T5X repository.
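Assuming the checkpoint is hosted on the Hugging Face Hub under this repository's id (an assumption; substitute the actual id if it differs), the PyTorch weights can be loaded with the Transformers library:

```python
def load(model_id="DKYoon/mt5-small-lm-adapt"):  # assumed Hub id
    """Load the tokenizer and PyTorch weights.

    Weights are downloaded and cached on first use."""
    from transformers import AutoTokenizer, MT5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = MT5ForConditionalGeneration.from_pretrained(model_id)
    return tokenizer, model
```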