xlmr-large-ar-CLS-P / test_eval.txt
Default classification report:
              precision    recall  f1-score   support

F                0.7196    0.6160    0.6638       500
T                0.6643    0.7600    0.7090       500

accuracy                             0.6880      1000
macro avg        0.6920    0.6880    0.6864      1000
weighted avg     0.6920    0.6880    0.6864      1000
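The default report fully determines the underlying binary confusion matrix (recall times support gives integer true-positive counts). A minimal pure-Python sketch, reconstructing that matrix from the reported numbers (the counts 308/192/120/380 are derived here, not stated in the source):

```python
# Reconstruct the confusion matrix implied by the report above.
support = {"F": 500, "T": 500}
recall = {"F": 0.6160, "T": 0.7600}

# True positives per class: recall * support (rounds to an integer count).
tp = {label: round(recall[label] * support[label]) for label in support}
fn = {label: support[label] - tp[label] for label in support}

# Everything not predicted correctly as one class was predicted as the other.
pred_F = tp["F"] + fn["T"]  # examples predicted F
pred_T = tp["T"] + fn["F"]  # examples predicted T

precision_F = tp["F"] / pred_F
precision_T = tp["T"] / pred_T
accuracy = (tp["F"] + tp["T"]) / sum(support.values())

print(round(precision_F, 4))  # 0.7196
print(round(precision_T, 4))  # 0.6643
print(accuracy)               # 0.688
```

The derived precisions and accuracy match the report, confirming the reconstruction is consistent.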
ADJ
Accuracy = 0.6326530612244898
Weighted Recall = 0.6326530612244898
Weighted Precision = 0.6483718005815934
Weighted F1 = 0.6308140277387551
Macro Recall = 0.640251572327044
Macro Precision = 0.6431322207958922
Macro F1 = 0.6320400500625782
ADV
Accuracy = 0.8
Weighted Recall = 0.8
Weighted Precision = 0.64
Weighted F1 = 0.7111111111111111
Macro Recall = 0.5
Macro Precision = 0.4
Macro F1 = 0.4444444444444445
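The ADV numbers are exactly what a classifier predicting only the majority class produces on an 80/20 class split (macro precision 0.4, macro recall 0.5). A pure-Python sketch with hypothetical label lists chosen to match that split reproduces every figure in the block:

```python
from collections import Counter

# Hypothetical labels consistent with the ADV block: 80/20 split,
# every example predicted as the majority class "T".
gold = ["T"] * 8 + ["F"] * 2
pred = ["T"] * 10

labels = ["F", "T"]
support = Counter(gold)
n = len(gold)

def prf(label):
    """Per-class precision, recall, and F1."""
    tp = sum(g == p == label for g, p in zip(gold, pred))
    pred_pos = sum(p == label for p in pred)
    precision = tp / pred_pos if pred_pos else 0.0
    recall = tp / support[label] if support[label] else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

per_class = {label: prf(label) for label in labels}

accuracy = sum(g == p for g, p in zip(gold, pred)) / n
macro_p = sum(per_class[l][0] for l in labels) / len(labels)
macro_r = sum(per_class[l][1] for l in labels) / len(labels)
macro_f1 = sum(per_class[l][2] for l in labels) / len(labels)
weighted_p = sum(per_class[l][0] * support[l] / n for l in labels)
weighted_f1 = sum(per_class[l][2] * support[l] / n for l in labels)

print(round(accuracy, 4))     # 0.8
print(round(weighted_p, 4))   # 0.64
print(round(weighted_f1, 4))  # 0.7111
print(round(macro_f1, 4))     # 0.4444
```

This is why macro and weighted averages diverge so sharply here: the minority class contributes zero precision, recall, and F1, which macro averaging weights equally but weighted averaging discounts by support.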
NOUN
Accuracy = 0.7145748987854251
Weighted Recall = 0.7145748987854251
Weighted Precision = 0.7180149366290619
Weighted F1 = 0.7130601796741818
Macro Recall = 0.7137213114754098
Macro Precision = 0.7183276673421198
Macro F1 = 0.7127847761994104
VERB
Accuracy = 0.6658291457286433
Weighted Recall = 0.6658291457286433
Weighted Precision = 0.6689094016355324
Weighted F1 = 0.6647185078039871
Macro Recall = 0.6664267495012248
Macro Precision = 0.6686629811629812
Macro F1 = 0.664896209871932