xlmr-large-all-ET / test_eval_en.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.8918    0.8900    0.8909       500
           T     0.8902    0.8920    0.8911       500

    accuracy                         0.8910      1000
   macro avg     0.8910    0.8910    0.8910      1000
weighted avg     0.8910    0.8910    0.8910      1000
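
The report above follows the layout of scikit-learn's classification_report. A minimal sketch of how a report like it could be regenerated, assuming the gold labels and model predictions are available as two lists of "T"/"F" strings (the variable names and placeholder values below are illustrative, not taken from this upload):

# Minimal sketch, assuming per-example gold labels and predictions;
# the lists here are placeholders, not the actual test data.
from sklearn.metrics import classification_report

y_true = ["T", "F", "T", "F"]   # placeholder gold labels
y_pred = ["T", "F", "F", "F"]   # placeholder model predictions

print(classification_report(y_true, y_pred, digits=4))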
ADJ
Accuracy = 0.8611111111111112
Weighted Recall = 0.8611111111111112
Weighted Precision = 0.8616044616044616
Weighted F1 = 0.8611916264090177
Macro Recall = 0.8614551083591331
Macro Precision = 0.8606177606177606
Macro F1 = 0.8608695652173913
ADV
Accuracy = 0.7666666666666667
Weighted Recall = 0.7666666666666667
Weighted Precision = 0.8126696832579187
Weighted F1 = 0.7679644048943269
Macro Recall = 0.7916666666666666
Macro Precision = 0.7850678733031675
Macro F1 = 0.7664071190211346
NOUN
Accuracy = 0.8920454545454546
Weighted Recall = 0.8920454545454546
Weighted Precision = 0.892332073969671
Weighted F1 = 0.892031900152266
Macro Recall = 0.8920941243991678
Macro Precision = 0.89229112833764
Macro F1 = 0.8920357728360341
VERB
Accuracy = 0.9161073825503355
Weighted Recall = 0.9161073825503355
Weighted Precision = 0.9176311030741412
Weighted F1 = 0.9160307924664405
Macro Recall = 0.9161073825503356
Macro Precision = 0.9176311030741411
Macro F1 = 0.9160307924664405
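
The per-tag blocks above report both weighted and macro averages: macro averaging treats every class equally, while weighted averaging weights each class by its support. A hedged sketch of how these per-POS numbers could be computed with scikit-learn (the helper name, label values, and the way test items are grouped by POS tag are assumptions, not taken from this file):

# Sketch only: accuracy plus weighted and macro P/R/F1 for one POS tag's
# subset of the test set. Variable and function names are illustrative.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def tag_metrics(y_true, y_pred):
    acc = accuracy_score(y_true, y_pred)
    w_p, w_r, w_f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0)
    m_p, m_r, m_f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="macro", zero_division=0)
    print(f"Accuracy = {acc}")
    print(f"Weighted Recall = {w_r}")
    print(f"Weighted Precision = {w_p}")
    print(f"Weighted F1 = {w_f1}")
    print(f"Macro Recall = {m_r}")
    print(f"Macro Precision = {m_p}")
    print(f"Macro F1 = {m_f1}")

# Example call for the subset of test items whose target word is a NOUN
# (placeholder labels, not the actual data behind the numbers above).
tag_metrics(["T", "F", "T", "T"], ["T", "F", "F", "T"])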