| ---
|
| license: mit
|
| ---
|
|
|
| # Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators
This model card describes the AMOS model (**base++** version) proposed in [this paper](https://arxiv.org/abs/2204.03243). The official GitHub repository can be found [here](https://github.com/microsoft/AMOS).
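
A minimal sketch of loading the model with the Hugging Face `transformers` library. The Hub identifier `microsoft/amos` is an assumption and may differ from the exact name of the released base++ checkpoint; consult the official repository for the published identifiers.

```python
def load_amos(model_name: str = "microsoft/amos"):
    """Load the AMOS encoder and its tokenizer via Hugging Face transformers.

    Note: the Hub identifier "microsoft/amos" is an assumption; check the
    official repository for the exact name of the released checkpoint.
    """
    # Imported lazily so this sketch parses even without transformers installed.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    return tokenizer, model


# Example (requires transformers and network access):
#   tokenizer, model = load_amos()
#   inputs = tokenizer("AMOS is a pretrained text encoder.", return_tensors="pt")
#   outputs = model(**inputs)  # outputs.last_hidden_state holds token embeddings
```

As with other BERT-style encoders, the loaded model returns contextual token representations that can be fine-tuned for downstream tasks.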
| # Citation
|
If you find this model useful for your research, please cite the following paper:
|
| ```
|
| @inproceedings{meng2022amos,
|
| title={Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators},
|
| author={Meng, Yu and Xiong, Chenyan and Bajaj, Payal and Tiwary, Saurabh and Bennett, Paul and Han, Jiawei and Song, Xia},
|
| booktitle={ICLR},
|
| year={2022}
|
| }
|
| ``` |