xlmr-large-all-CLS / test_eval_en.txt
Default classification report:

              precision    recall  f1-score   support

           F     0.6504    0.9080    0.7579       500
           T     0.8477    0.5120    0.6384       500

    accuracy                         0.7100      1000
   macro avg     0.7491    0.7100    0.6982      1000
weighted avg     0.7491    0.7100    0.6982      1000
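The report above follows the scikit-learn classification_report layout: per-class precision, recall, and F1 for the F and T labels, plus overall accuracy and macro/weighted averages. A minimal sketch of how such a report is typically produced follows; the variable names gold_labels and predicted_labels are placeholders, and digits=4 is an assumption made to match the four decimal places shown, neither of which is recorded in this file.

# Sketch only: assumes scikit-learn and placeholder label lists;
# neither the variable names nor the data come from this file.
from sklearn.metrics import classification_report

gold_labels = ["F", "T", "T", "F"]        # placeholder gold labels
predicted_labels = ["F", "F", "T", "F"]   # placeholder model predictions

# digits=4 reproduces the four decimal places shown above (default is 2).
print(classification_report(gold_labels, predicted_labels, digits=4))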
ADJ
Accuracy = 0.7361111111111112
Weighted Recall = 0.7361111111111112
Weighted Precision = 0.76440329218107
Weighted F1 = 0.7321540625338093
Macro Recall = 0.7438080495356036
Macro Precision = 0.7592592592592593
Macro F1 = 0.7335929892891917
ADV
Accuracy = 0.6
Weighted Recall = 0.6
Weighted Precision = 0.575
Weighted F1 = 0.5619047619047619
Macro Recall = 0.5416666666666667
Macro Precision = 0.5625
Macro F1 = 0.5238095238095238
NOUN
Accuracy = 0.7291666666666666
Weighted Recall = 0.7291666666666666
Weighted Precision = 0.764611623288735
Weighted F1 = 0.7194976076555023
Macro Recall = 0.728466891455628
Macro Precision = 0.7649770352126739
Macro F1 = 0.7192982456140351
VERB
Accuracy = 0.674496644295302
Weighted Recall = 0.674496644295302
Weighted Precision = 0.729611190137506
Weighted F1 = 0.65371668164121
Macro Recall = 0.674496644295302
Macro Precision = 0.729611190137506
Macro F1 = 0.6537166816412099
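Each block above (ADJ, ADV, NOUN, VERB) reports accuracy together with weighted and macro averages of recall, precision, and F1 over the test items belonging to that POS tag. The sketch below shows one way such per-tag numbers could be computed; the (tag, gold, prediction) triples and the loop are illustrative assumptions, not the script that produced this file.

# Sketch only: per-POS-tag metrics printed in the same order as the blocks
# above, assuming scikit-learn and placeholder (tag, gold, prediction) triples.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

items = [  # hypothetical per-item data, not taken from this file
    ("ADJ", "T", "T"), ("ADJ", "F", "T"),
    ("ADV", "F", "F"), ("NOUN", "T", "F"), ("VERB", "T", "T"),
]

for tag in ("ADJ", "ADV", "NOUN", "VERB"):
    gold = [g for t, g, p in items if t == tag]
    pred = [p for t, g, p in items if t == tag]
    if not gold:
        continue
    print(tag)
    print("Accuracy =", accuracy_score(gold, pred))
    for avg in ("weighted", "macro"):
        prec, rec, f1, _ = precision_recall_fscore_support(
            gold, pred, average=avg, zero_division=0)
        print(f"{avg.capitalize()} Recall = {rec}")
        print(f"{avg.capitalize()} Precision = {prec}")
        print(f"{avg.capitalize()} F1 = {f1}")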