How to use qiyuw/pcl-roberta-large with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer

# Note: RobertaForCL is defined in the PeerCL repository
# (https://github.com/qiyuw/PeerCL), not in the transformers package;
# import it from a clone of that repository before running this snippet.
tokenizer = AutoTokenizer.from_pretrained("qiyuw/pcl-roberta-large")
model = RobertaForCL.from_pretrained("qiyuw/pcl-roberta-large")
```
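Once the model and tokenizer are loaded, sentence embeddings are typically pooled from the token-level outputs and compared with cosine similarity. The sketch below illustrates that pooling-and-scoring step with plain NumPy on stand-in arrays; the shapes and the mean-pooling strategy are illustrative assumptions (the PeerCL code may instead use e.g. the `[CLS]` representation), not the repository's exact pipeline.

```python
import numpy as np

def mean_pool(token_embs: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) positions."""
    mask = attention_mask[..., None].astype(token_embs.dtype)  # (B, T, 1)
    return (token_embs * mask).sum(axis=1) / mask.sum(axis=1)

def cosine_sim(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Row-wise cosine similarity between two batches of embeddings."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return (a * b).sum(axis=1)

# Stand-in for model outputs: batch of 2 sentences, 5 tokens, hidden size 8.
rng = np.random.default_rng(0)
token_embs = rng.normal(size=(2, 5, 8))
attention_mask = np.array([[1, 1, 1, 0, 0],
                           [1, 1, 1, 1, 1]])

embs = mean_pool(token_embs, attention_mask)
sims = cosine_sim(embs, embs)  # identical embeddings -> similarity 1.0
```

With real inputs, `token_embs` would be the model's last hidden state and `attention_mask` would come from the tokenizer.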
Model Description
Refer to the PCL repository for details: https://github.com/qiyuw/PeerCL
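PCL trains sentence embeddings contrastively. As background, a minimal in-batch contrastive (InfoNCE-style) objective can be sketched as below; this is a generic illustration of the loss family, not the paper's exact peer-contrastive objective, which contrasts diverse augmentations across peer networks.

```python
import numpy as np

def info_nce_loss(anchors: np.ndarray, positives: np.ndarray,
                  temperature: float = 0.05) -> float:
    """Generic in-batch InfoNCE: row i of `positives` is the positive for
    row i of `anchors`; all other rows act as in-batch negatives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (B, B) scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diag(log_probs).mean())     # cross-entropy on the diagonal

# Toy orthogonal embeddings: matched positives give near-zero loss,
# mismatched (shifted) positives give a much larger loss.
anchors = np.eye(4)
low = info_nce_loss(anchors, anchors)
high = info_nce_loss(anchors, np.roll(anchors, 1, axis=0))
```

The temperature of 0.05 follows common practice in contrastive sentence-embedding work; the value actually used by PCL is set in its repository configs.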
Citation
Cite our paper if PCL helps your work:
```bibtex
@inproceedings{wu-etal-2022-pcl,
    title = "{PCL}: Peer-Contrastive Learning with Diverse Augmentations for Unsupervised Sentence Embeddings",
    author = "Wu, Qiyu and Tao, Chongyang and Shen, Tao and Xu, Can and Geng, Xiubo and Jiang, Daxin",
    booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, United Arab Emirates",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.emnlp-main.826",
    pages = "12052--12066",
}
```