# Text Style Transfer using CycleGANs
This repository contains the models from the paper "Self-supervised Text Style Transfer using Cycle-Consistent Adversarial Networks" (ACM TIST 2024).
The work introduces a novel approach to Text Style Transfer using CycleGANs with sequence-level supervision and Transformer architectures.
## Available Models

### Formality transfer

#### GYAFC dataset (Family & Relationships)
| model | checkpoint |
|---|---|
| BART base | informal-to-formal, formal-to-informal |
| BART large | informal-to-formal, formal-to-informal |
| T5 small | informal-to-formal, formal-to-informal |
| T5 base | informal-to-formal, formal-to-informal |
| T5 large | informal-to-formal, formal-to-informal |
| BERT base | style classifier |
#### GYAFC dataset (Entertainment & Music)
| model | checkpoint |
|---|---|
| BART base | informal-to-formal, formal-to-informal |
| BART large | informal-to-formal, formal-to-informal |
| T5 small | informal-to-formal, formal-to-informal |
| T5 base | informal-to-formal, formal-to-informal |
| T5 large | informal-to-formal, formal-to-informal |
| BERT base | style classifier |
### Sentiment transfer

#### Yelp dataset
| model | checkpoint |
|---|---|
| BART base | negative-to-positive, positive-to-negative |
| BART large | negative-to-positive, positive-to-negative |
| T5 small | negative-to-positive, positive-to-negative |
| T5 base | negative-to-positive, positive-to-negative |
| T5 large | negative-to-positive, positive-to-negative |
| BERT base | style classifier |
## Model Description
The models implement a CycleGAN architecture for Text Style Transfer that:
- Applies self-supervision directly at sequence level
- Maintains content while transferring style attributes
- Employs pre-trained style classifiers to guide generation
- Uses Transformer-based generators and discriminators
The models achieve state-of-the-art results on both formality and sentiment transfer tasks.
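The interplay of the properties listed above can be sketched as a toy training step. This is illustrative pseudocode of the objective only, not the paper's exact losses: `g_ab`, `g_ba`, and `classifier` are hypothetical callables standing in for the Transformer generators and the pre-trained style classifier.

```python
def cycle_step(g_ab, g_ba, classifier, x_a):
    """One illustrative A->B training step: transfer, cycle back, score style.

    g_ab / g_ba: generators mapping between styles A and B.
    classifier: returns the probability that a sentence is in style B.
    x_a: an input sentence in style A.
    """
    y_b = g_ab(x_a)        # transfer A -> B (the generated target-style text)
    x_a_rec = g_ba(y_b)    # map back B -> A to close the cycle
    # Sequence-level self-supervision: the reconstruction should match the input
    cycle_loss = 0.0 if x_a_rec == x_a else 1.0
    # The pre-trained style classifier guides generation toward the target style
    style_loss = 1.0 - classifier(y_b)
    return cycle_loss + style_loss
```

Because supervision acts on the reconstructed sequence rather than on token-level alignments, no parallel style pairs are needed at training time.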
## Usage
Both the generators and the style classifiers can be used with the Hugging Face 🤗 `transformers` library.

Each generator model can be loaded as follows:
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("[GENERATOR_MODEL]")
tokenizer = AutoTokenizer.from_pretrained("[GENERATOR_MODEL]")
```
The style classifiers can be loaded as follows:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

classifier = AutoModelForSequenceClassification.from_pretrained("[CLASSIFIER_MODEL]")
tokenizer = AutoTokenizer.from_pretrained("[CLASSIFIER_MODEL]")
```
## Citation
For more details, you can refer to the paper.
```bibtex
@article{10.1145/3678179,
  author = {La Quatra, Moreno and Gallipoli, Giuseppe and Cagliero, Luca},
  title = {Self-supervised Text Style Transfer Using Cycle-Consistent Adversarial Networks},
  year = {2024},
  issue_date = {October 2024},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  volume = {15},
  number = {5},
  issn = {2157-6904},
  url = {https://doi.org/10.1145/3678179},
  doi = {10.1145/3678179},
  journal = {ACM Trans. Intell. Syst. Technol.},
  month = nov,
  articleno = {110},
  numpages = {38},
  keywords = {Text Style Transfer, Sentiment transfer, Formality transfer, Cycle-consistent Generative Adversarial Networks, Transformers}
}
```
## Code
The full implementation is available at: https://github.com/gallipoligiuseppe/TST-CycleGAN.
## License
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.