# Emoji Offensive Classifier

This Streamlit app chains two models: a fine-tuned Qwen model that translates emoji-heavy Chinese text into plain text, and a toxic-comment classifier that scores the translated text for offensiveness.

- 🔤 Translation: `JenniferHJF/qwen1.5-emoji-finetuned`
- 🧠 Classification: `unitary/toxic-bert`
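
The two-stage flow above could be wired together roughly as follows. This is a minimal sketch using the Hugging Face `transformers` `pipeline` API with the two model IDs from this README; the generation parameters and the exact prompt format expected by the fine-tuned Qwen model are assumptions, not taken from this repo's code.

```python
def build_offensiveness_classifier():
    """Return a function that translates emoji-heavy Chinese text,
    then scores the translation for toxicity (sketch, assumed API usage)."""
    # Imported lazily so this module can be loaded without transformers installed.
    from transformers import pipeline

    # Stage 1: fine-tuned Qwen model translates emoji-heavy Chinese text.
    translator = pipeline(
        "text-generation",
        model="JenniferHJF/qwen1.5-emoji-finetuned",
    )
    # Stage 2: toxic-bert scores the translated text for offensiveness.
    classifier = pipeline(
        "text-classification",
        model="unitary/toxic-bert",
    )

    def classify(text: str) -> dict:
        # max_new_tokens is an illustrative choice, not from the repo.
        translated = translator(text, max_new_tokens=64)[0]["generated_text"]
        # Returns a dict like {"label": ..., "score": ...} from toxic-bert.
        return classifier(translated)[0]

    return classify
```

In a Streamlit app, `build_offensiveness_classifier()` would typically be wrapped in `st.cache_resource` so the models are loaded once per session rather than on every rerun.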