---

license: mit
---


# Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators

This model card describes the AMOS model (**base++** version) proposed in [this paper](https://arxiv.org/abs/2204.03243). The official GitHub repository can be found [here](https://github.com/microsoft/AMOS).
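A minimal usage sketch with the `transformers` library, assuming the checkpoint is published on the Hugging Face Hub under an identifier such as `microsoft/amos` (hypothetical; check the official repository for the exact name):

```python
from transformers import AutoTokenizer, AutoModel

# Hypothetical Hub identifier -- verify against the official repository.
model_name = "microsoft/amos"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a sentence and obtain contextual token representations.
inputs = tokenizer("AMOS is a pretrained text encoder.", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

As with other BERT-style encoders, the per-token hidden states can be pooled (for example, via the first token) to produce a sentence-level embedding for downstream tasks.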

# Citation
If you find this model useful for your research, please cite the following paper:
```
@inproceedings{meng2022amos,
  title={Pretraining Text Encoders with Adversarial Mixture of Training Signal Generators},
  author={Meng, Yu and Xiong, Chenyan and Bajaj, Payal and Tiwary, Saurabh and Bennett, Paul and Han, Jiawei and Song, Xia},
  booktitle={ICLR},
  year={2022}
}
```