
Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators

This model card describes the AMOS model (base++ version) proposed in Meng et al. (ICLR 2022); see the citation below. An official GitHub repository accompanies the paper.
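Since the repository links are not preserved here, the snippet below is a minimal sketch of how an AMOS checkpoint could be loaded and run with the Hugging Face `transformers` library. The hub ID `microsoft/amos` is an assumption; substitute the actual checkpoint name if it differs.

```python
from transformers import AutoTokenizer, AutoModel

# Hypothetical hub ID -- replace with the actual AMOS checkpoint name.
model_id = "microsoft/amos"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and obtain contextualized token embeddings.
inputs = tokenizer("AMOS is a pretrained text encoder.", return_tensors="pt")
outputs = model(**inputs)

# Shape: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

As with other BERT-style encoders, `last_hidden_state` can be pooled (e.g. via the `[CLS]` token or mean pooling) for downstream classification or retrieval tasks.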

Citation

If you find this model useful for your research, please cite the following paper:

@inproceedings{meng2022amos,
  title={Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators},
  author={Meng, Yu and Xiong, Chenyan and Bajaj, Payal and Tiwary, Saurabh and Bennett, Paul and Han, Jiawei and Song, Xia},
  booktitle={ICLR},
  year={2022}
}