Paper: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (arXiv:1810.04805)
K-urse_Detection_with_BERT: Korean curse expression detection with a fine-tuned klue-BERT
This repository is the final output of the KWU "Text Mining" course for the first semester of 2023.
See the project overview here: Notion (Korean)
See this model on GitHub: Link
| Model/Metric | Accuracy | Precision | Recall | F1 Score |
|---|---|---|---|---|
| Comparison (Electra base) | 0.81 | 0.69 | 0.87 | 0.77 |
| klue-BERT base (our best result) | 0.83 | 0.76 | 0.75 | 0.75 |
| Model/Metric | Accuracy | Precision | Recall | F1 Score |
|---|---|---|---|---|
| Comparison (Electra base) | 0.77 | 0.52 | 0.90 | 0.66 |
| klue-BERT base (our best result) | 0.89 | 0.75 | 0.80 | 0.78 |
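The accuracy, precision, recall, and F1 scores in the tables above are standard binary-classification metrics derived from the confusion matrix. A minimal sketch of how they are computed is below; the counts used in the example are illustrative only, not the project's actual confusion matrix.

```python
def classification_metrics(tp, fp, fn, tn):
    """Return (accuracy, precision, recall, f1) from binary confusion-matrix counts.

    tp/fp/fn/tn: true positives, false positives, false negatives, true negatives,
    where "positive" means the curse-expression class.
    """
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # of texts flagged as cursing, how many really were
    recall = tp / (tp + fn)             # of actual cursing texts, how many were caught
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return accuracy, precision, recall, f1

# Illustrative counts only (not from this project's evaluation):
acc, prec, rec, f1 = classification_metrics(tp=75, fp=25, fn=25, tn=175)
print(f"acc={acc:.2f} prec={prec:.2f} rec={rec:.2f} f1={f1:.2f}")
# → acc=0.83 prec=0.75 rec=0.75 f1=0.75
```

In practice `sklearn.metrics.precision_recall_fscore_support` computes the same quantities directly from label arrays.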
Try the demo here: HuggingFace Space