This model is based on microsoft/deberta-v3-base and was fine-tuned on a synthetic dataset (6 labels were converted to 3 labels).
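
As a quick start, here is a minimal inference sketch using Hugging Face Transformers. The repo id and example text below are placeholders, and this card does not document what the 3 labels mean, so treat it as a template rather than the exact intended usage:

```python
# Minimal inference sketch. Assumptions: "user/model" is a placeholder for
# this repo's Hub id, and the label meanings are undocumented here, so only
# the raw label id (0, 1, or 2, matching the reports below) is printed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "user/model"  # placeholder: replace with this model's Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Some text to classify.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 3) for the 3 labels

print("predicted label id:", logits.argmax(dim=-1).item())
```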
Performance on the test dataset:

| label        | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| 0            | 0.98      | 0.99   | 0.98     | 94      |
| 1            | 0.96      | 0.96   | 0.96     | 28      |
| 2            | 1.00      | 0.98   | 0.99     | 66      |
| accuracy     |           |        | 0.98     | 188     |
| macro avg    | 0.98      | 0.98   | 0.98     | 188     |
| weighted avg | 0.98      | 0.98   | 0.98     | 188     |
Performance on a similar benchmark:

| label        | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| 0            | 0.13      | 0.52   | 0.21     | 23      |
| 1            | 0.44      | 0.15   | 0.22     | 75      |
| 2            | 0.00      | 0.00   | 0.00     | 19      |
| accuracy     |           |        | 0.20     | 117     |
| macro avg    | 0.19      | 0.22   | 0.14     | 117     |
| weighted avg | 0.31      | 0.20   | 0.18     | 117     |
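
The tables above follow the per-class precision/recall/F1 layout of scikit-learn's `classification_report`. For reference, a sketch of producing such a report; the gold labels and predictions here are hypothetical stand-ins, not this model's actual evaluation data:

```python
# Sketch only: y_true / y_pred are hypothetical stand-ins, not the
# actual evaluation data used for the reports above.
from sklearn.metrics import classification_report

y_true = [0, 0, 0, 1, 2, 2]  # gold labels for an eval split
y_pred = [0, 0, 1, 1, 2, 1]  # model predictions (argmax of logits)
print(classification_report(y_true, y_pred, digits=2))
```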