Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators

This model card describes the AMOS base++ model proposed in the ICLR 2022 paper *Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators* (Meng et al.). The official GitHub repository is available at https://github.com/microsoft/AMOS.
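
Below is a minimal usage sketch. It assumes the checkpoint is published under the Hugging Face ID `microsoft/amos` and loads with the generic `AutoTokenizer`/`AutoModel` classes; the official repository ships a custom model implementation, so check there for the exact model class and loading instructions.

```python
# Minimal sketch for encoding text with the pretrained AMOS encoder.
# Assumption: the checkpoint ID "microsoft/amos" works with the generic
# Auto classes; the official repo may require its dedicated model class.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/amos")
model = AutoModel.from_pretrained("microsoft/amos")

# Tokenize a sentence and run it through the encoder.
inputs = tokenizer("AMOS is a pretrained text encoder.", return_tensors="pt")
outputs = model(**inputs)

# Contextualized token representations: (batch, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```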

Citation

If you find this model useful for your research, please cite the following paper:

```bibtex
@inproceedings{meng2022amos,
  title={Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators},
  author={Meng, Yu and Xiong, Chenyan and Bajaj, Payal and Tiwary, Saurabh and Bennett, Paul and Han, Jiawei and Song, Xia},
  booktitle={ICLR},
  year={2022}
}
```