---
license: apache-2.0
---

🤗 Language model initialized from [mT5](https://huggingface.co/google/mt5-base) and trained for an additional 100K steps on the Prefix LM objective using mC4 data.
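Under the Prefix LM objective, each training document is split at a point into an uncorrupted prefix (fed to the encoder with bidirectional attention) and a continuation (predicted autoregressively by the decoder). A minimal sketch of that splitting step, assuming a token sequence is already available (the function name and dictionary keys are illustrative, not taken from the T5X codebase):

```python
import random

def make_prefix_lm_example(tokens, rng):
    """Split a token sequence into an encoder prefix and a decoder target."""
    # Choose a random split point strictly inside the sequence, so both
    # the prefix and the continuation are non-empty.
    split = rng.randint(1, len(tokens) - 1)
    # The prefix is consumed bidirectionally; the continuation is the
    # autoregressive prediction target.
    return {"inputs": tokens[:split], "targets": tokens[split:]}

rng = random.Random(0)
example = make_prefix_lm_example(list(range(10)), rng)
# Prefix and target together reconstruct the original sequence.
assert example["inputs"] + example["targets"] == list(range(10))
```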

Paper: [Overcoming Catastrophic Forgetting in Zero-Shot Cross-Lingual Generation](https://arxiv.org/abs/2205.12647)

Authors: Tu Vu, Aditya Barua, Brian Lester, Daniel Cer, Mohit Iyyer, Noah Constant

PyTorch port of the original Flax checkpoint from the [google-research/t5x](https://github.com/google-research/t5x) repository.