xlmr-large-all-CLS-P / test_eval_ar.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.8012    0.7820    0.7915       500
           T     0.7871    0.8060    0.7964       500

    accuracy                         0.7940      1000
   macro avg     0.7942    0.7940    0.7940      1000
weighted avg     0.7942    0.7940    0.7940      1000
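For reference, a report in this layout can be reproduced with scikit-learn's classification_report. The snippet below is a minimal sketch with made-up gold/predicted labels; the evaluation script and actual test data behind this file are not shown here, and the F/T label values are only illustrative.

from sklearn.metrics import classification_report

# Illustrative labels only; the real test set has 500 instances per class.
y_true = ["F", "T", "T", "F", "T"]
y_pred = ["F", "T", "F", "F", "T"]

# digits=4 matches the four-decimal formatting used above.
print(classification_report(y_true, y_pred, digits=4))
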
ADJ
Accuracy = 0.7755102040816326
Weighted Recall = 0.7755102040816326
Weighted Precision = 0.776719198317625
Weighted F1 = 0.7757917548290219
Macro Recall = 0.7756813417190775
Macro Precision = 0.7743012098456403
Macro F1 = 0.7746655518394648
ADV
Accuracy = 0.6
Weighted Recall = 0.6
Weighted Precision = 0.7166666666666667
Weighted F1 = 0.6380952380952382
Macro Recall = 0.5625
Macro Precision = 0.5416666666666667
Macro F1 = 0.5238095238095238
NOUN
Accuracy = 0.7975708502024291
Weighted Recall = 0.7975708502024291
Weighted Precision = 0.7981559178688616
Weighted F1 = 0.7973914388629947
Macro Recall = 0.7972459016393443
Macro Precision = 0.7983021847854699
Macro F1 = 0.7973017331932772
VERB
Accuracy = 0.7989949748743719
Weighted Recall = 0.7989949748743719
Weighted Precision = 0.7989949748743719
Weighted F1 = 0.7989949748743719
Macro Recall = 0.798974669798217
Macro Precision = 0.798974669798217
Macro F1 = 0.798974669798217
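
Each per-tag block above (ADJ, ADV, NOUN, VERB) reports accuracy together with weighted- and macro-averaged recall, precision, and F1 on that tag's subset of the test data. A minimal sketch of how such numbers are commonly computed with scikit-learn follows; the helper name and the dummy labels are assumptions for illustration, not the script that produced this file.

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def report_subset(y_true, y_pred):
    # Accuracy plus weighted and macro P/R/F1 for one POS-tag subset,
    # printed in the same order as the blocks above.
    print("Accuracy =", accuracy_score(y_true, y_pred))
    for avg in ("weighted", "macro"):
        p, r, f1, _ = precision_recall_fscore_support(
            y_true, y_pred, average=avg, zero_division=0
        )
        print(f"{avg.capitalize()} Recall =", r)
        print(f"{avg.capitalize()} Precision =", p)
        print(f"{avg.capitalize()} F1 =", f1)

# Dummy call; a real evaluation would slice gold/predicted labels
# by the POS tag of each test instance before calling this.
report_subset(["T", "F", "T", "T"], ["T", "F", "F", "T"])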