xlmr-large-all-CLS-B / test_eval_ru.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.7025    0.7980    0.7472       500
           T     0.7662    0.6620    0.7103       500

    accuracy                         0.7300      1000
   macro avg     0.7343    0.7300    0.7287      1000
weighted avg     0.7343    0.7300    0.7287      1000
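
The report above follows scikit-learn's classification_report layout over the binary F/T labels. A minimal sketch of how such a report could be produced is shown below; y_true and y_pred stand in for the gold and predicted labels of the test examples and the toy values are illustrative only, not the actual evaluation data.

# Minimal sketch, assuming y_true / y_pred hold the gold and predicted
# "F"/"T" labels for the test set; the toy values are placeholders.
from sklearn.metrics import classification_report

y_true = ["F", "F", "T", "T"]   # gold labels (placeholder)
y_pred = ["F", "T", "T", "T"]   # model predictions (placeholder)

print("Default classification report:")
print(classification_report(y_true, y_pred, digits=4))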
ADJ
Accuracy = 0.6666666666666666
Weighted Recall = 0.6666666666666666
Weighted Precision = 0.7022222222222222
Weighted F1 = 0.6726998491704373
Macro Recall = 0.6794258373205742
Macro Precision = 0.6666666666666667
Macro F1 = 0.6606334841628959
ADV
Accuracy = 0.4375
Weighted Recall = 0.4375
Weighted Precision = 0.5113636363636364
Weighted F1 = 0.42647058823529416
Macro Recall = 0.4833333333333333
Macro Precision = 0.4818181818181818
Macro F1 = 0.43529411764705883
NOUN
Accuracy = 0.7508591065292096
Weighted Recall = 0.7508591065292096
Weighted Precision = 0.7516175271390544
Weighted F1 = 0.7502539921270467
Macro Recall = 0.7495035460992907
Macro Precision = 0.7519425645432736
Macro F1 = 0.7497353226394783
VERB
Accuracy = 0.7150537634408602
Weighted Recall = 0.7150537634408602
Weighted Precision = 0.7253327932984116
Weighted F1 = 0.7125312341480828
Macro Recall = 0.7166276346604216
Macro Precision = 0.7245212909412364
Macro F1 = 0.7129295282469423
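
The ADJ, ADV, NOUN and VERB blocks above report the same metrics restricted to the test examples of each part of speech. A minimal sketch of how such a per-POS breakdown could be computed follows; it assumes each example carries a POS tag alongside its gold and predicted label, and the variable names and data layout are illustrative assumptions rather than the actual evaluation code.

# Minimal sketch of a per-POS metric breakdown; names and data are illustrative.
from collections import defaultdict
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# (pos_tag, gold_label, predicted_label) triples -- placeholder values.
examples = [
    ("NOUN", "T", "T"),
    ("NOUN", "F", "T"),
    ("VERB", "F", "F"),
    ("ADJ",  "T", "F"),
]

# Group gold and predicted labels by POS tag.
by_pos = defaultdict(lambda: ([], []))
for pos, gold, pred in examples:
    by_pos[pos][0].append(gold)
    by_pos[pos][1].append(pred)

for pos, (gold, pred) in sorted(by_pos.items()):
    print(pos)
    print(f"Accuracy = {accuracy_score(gold, pred)}")
    for avg in ("weighted", "macro"):
        p, r, f1, _ = precision_recall_fscore_support(
            gold, pred, average=avg, zero_division=0
        )
        print(f"{avg.capitalize()} Recall = {r}")
        print(f"{avg.capitalize()} Precision = {p}")
        print(f"{avg.capitalize()} F1 = {f1}")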