xlmr-large-all-CLS / test_eval_ru.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.5778    0.9060    0.7056       500
           T     0.7824    0.3380    0.4721       500

    accuracy                         0.6220      1000
   macro avg     0.6801    0.6220    0.5888      1000
weighted avg     0.6801    0.6220    0.5888      1000
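A minimal sketch (not necessarily the original evaluation script) of how a report in the format above can be produced with scikit-learn; the lists y_true and y_pred are hypothetical placeholders for the gold and predicted "F"/"T" labels.

# Hedged sketch: reproduce the "Default classification report" layout with
# scikit-learn. y_true / y_pred are hypothetical placeholders, not real data.
from sklearn.metrics import classification_report

y_true = ["F", "T", "F", "T"]  # gold labels (placeholder)
y_pred = ["F", "F", "T", "T"]  # model predictions (placeholder)

print("Default classification report:")
print(classification_report(y_true, y_pred, digits=4))
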
ADJ
Accuracy = 0.5333333333333333
Weighted Recall = 0.5333333333333333
Weighted Precision = 0.7946666666666666
Weighted F1 = 0.487962962962963
Macro Recall = 0.631578947368421
Macro Precision = 0.72
Macro F1 = 0.513888888888889

ADV
Accuracy = 0.4375
Weighted Recall = 0.4375
Weighted Precision = 0.775
Weighted F1 = 0.32792207792207795
Macro Recall = 0.55
Macro Precision = 0.7
Macro F1 = 0.37662337662337664

NOUN
Accuracy = 0.6323024054982818
Weighted Recall = 0.6323024054982818
Weighted Precision = 0.66929951017804
Weighted F1 = 0.6048521703327158
Macro Recall = 0.624290780141844
Macro Precision = 0.6716118292205249
Macro F1 = 0.60142089093702

VERB
Accuracy = 0.6209677419354839
Weighted Recall = 0.6209677419354839
Weighted Precision = 0.7037903225806452
Weighted F1 = 0.5819198695583084
Macro Recall = 0.6259432734842572
Macro Precision = 0.7016666666666667
Macro F1 = 0.5839223245520098
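
A minimal sketch of how the per-POS blocks above (Accuracy, Weighted/Macro Recall, Precision, F1) can be computed with scikit-learn, assuming each test example carries a POS tag alongside its gold and predicted label; the variable names and the tiny example list are hypothetical, not the original data or script.

# Hedged sketch: per-POS accuracy plus weighted and macro averages.
# "examples" is a hypothetical stand-in for the real evaluation data.
from collections import defaultdict
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

examples = [  # (pos_tag, gold_label, predicted_label) placeholders
    ("ADJ", "F", "F"), ("ADJ", "T", "F"),
    ("NOUN", "T", "T"), ("NOUN", "F", "T"),
]

# Group gold and predicted labels by POS tag.
by_pos = defaultdict(lambda: ([], []))
for pos, gold, pred in examples:
    by_pos[pos][0].append(gold)
    by_pos[pos][1].append(pred)

for pos, (gold, pred) in sorted(by_pos.items()):
    print(pos)
    print(f"Accuracy = {accuracy_score(gold, pred)}")
    for avg in ("weighted", "macro"):
        p, r, f1, _ = precision_recall_fscore_support(
            gold, pred, average=avg, zero_division=0)
        print(f"{avg.capitalize()} Recall = {r}")
        print(f"{avg.capitalize()} Precision = {p}")
        print(f"{avg.capitalize()} F1 = {f1}")

Note that with average="weighted" the recall is support-weighted, which is why Weighted Recall equals Accuracy in every block above.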