Multiscale Positive-Unlabeled Detection of AI-Generated Texts
Paper • 2305.18149 • Published
How to use yuchuantian/AIGC_detector_env2 with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="yuchuantian/AIGC_detector_env2")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("yuchuantian/AIGC_detector_env2")
model = AutoModelForSequenceClassification.from_pretrained("yuchuantian/AIGC_detector_env2")
```

[arXiv] [Codes (Model Links, Other Detector Versions)]
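When loading the model directly rather than through a pipeline, the classifier returns raw logits that must be converted to class probabilities. A minimal pure-Python sketch of that conversion (the logit values and the label order below are illustrative assumptions; the actual mapping is defined by the model's config):

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical per-class logits, as would come from model(**inputs).logits
logits = [2.0, -1.0]
probs = softmax(logits)  # probabilities summing to 1

# The index with the highest probability is the predicted class
predicted = max(range(len(probs)), key=lambda i: probs[i])
```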
The AIGC Detector (MPU) in our paper "Multiscale Positive-Unlabeled Detection of AI-Generated Texts".
Paper Link: https://arxiv.org/pdf/2305.18149.pdf
BibTex formatted citation:
@misc{tian2023multiscale,
title={Multiscale Positive-Unlabeled Detection of AI-Generated Texts},
author={Yuchuan Tian and Hanting Chen and Xutao Wang and Zheyuan Bai and Qinghua Zhang and Ruifeng Li and Chao Xu and Yunhe Wang},
year={2023},
eprint={2305.18149},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
En_v2: This model is trained with MPU from a pretrained RoBERTa-Base, but it does not follow the standard training setting described in the paper.