xlmr-large-ru-CLS-E / test_eval.txt
Default classification report:

              precision    recall  f1-score   support

           F     0.7143    0.6300    0.6695       500
           T     0.6691    0.7480    0.7063       500

    accuracy                         0.6890      1000
   macro avg     0.6917    0.6890    0.6879      1000
weighted avg     0.6917    0.6890    0.6879      1000
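
The report above follows scikit-learn's classification_report layout. A minimal sketch of how such a report could be produced, assuming the gold and predicted T/F labels are held in two lists (y_true and y_pred are illustrative names, not taken from this repository):

from sklearn.metrics import classification_report

# Placeholder labels; the real evaluation uses the 1000 test instances above.
y_true = ["F", "T", "T", "F"]
y_pred = ["F", "T", "F", "F"]

# digits=4 matches the four-decimal precision shown in this file.
print(classification_report(y_true, y_pred, digits=4))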
ADJ
Accuracy = 0.6666666666666666
Weighted Recall = 0.6666666666666666
Weighted Precision = 0.6817496229260934
Weighted F1 = 0.6712962962962963
Macro Recall = 0.6602870813397129
Macro Precision = 0.6515837104072397
Macro F1 = 0.6527777777777778
ADV
Accuracy = 0.625
Weighted Recall = 0.625
Weighted Precision = 0.6041666666666666
Weighted F1 = 0.6045454545454545
Macro Recall = 0.5666666666666667
Macro Precision = 0.5833333333333333
Macro F1 = 0.5636363636363636
NOUN
Accuracy = 0.6683848797250859
Weighted Recall = 0.6683848797250859
Weighted Precision = 0.6740847478686927
Weighted F1 = 0.6671928136556647
Macro Recall = 0.6704609929078014
Macro Precision = 0.6729447742399712
Macro F1 = 0.6676696400834332
VERB
Accuracy = 0.7258064516129032
Weighted Recall = 0.7258064516129032
Weighted Precision = 0.7267608322198892
Weighted F1 = 0.7252500523669879
Macro Recall = 0.7251279382426923
Macro Precision = 0.7269653423499578
Macro F1 = 0.7250115955473099
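
The ADJ, ADV, NOUN, and VERB blocks above report accuracy plus weighted and macro precision/recall/F1 on the subset of test instances with that POS tag. A hedged sketch of how one such block could be computed with scikit-learn, assuming the gold and predicted labels for a single tag are already filtered out (all names here are illustrative, not from the original evaluation script):

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def per_tag_metrics(tag, y_true, y_pred):
    # Print one block in the same order as this file:
    # Accuracy, then Weighted Recall/Precision/F1, then Macro Recall/Precision/F1.
    print(tag)
    print(f"Accuracy = {accuracy_score(y_true, y_pred)}")
    for avg in ("weighted", "macro"):
        p, r, f1, _ = precision_recall_fscore_support(
            y_true, y_pred, average=avg, zero_division=0)
        print(f"{avg.capitalize()} Recall = {r}")
        print(f"{avg.capitalize()} Precision = {p}")
        print(f"{avg.capitalize()} F1 = {f1}")

# Example call with placeholder labels for the ADJ subset.
per_tag_metrics("ADJ", ["T", "F", "T"], ["T", "T", "T"])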