xlmr-large-en-CLS-P / test_eval.txt
Default classification report:
              precision    recall  f1-score   support

           F     0.9052    0.8980    0.9016       500
           T     0.8988    0.9060    0.9024       500

    accuracy                         0.9020      1000
   macro avg     0.9020    0.9020    0.9020      1000
weighted avg     0.9020    0.9020    0.9020      1000
ADJ
Accuracy = 0.8819444444444444
Weighted Recall = 0.8819444444444444
Weighted Precision = 0.8823379855025425
Weighted F1 = 0.8817554639286388
Macro Recall = 0.8804179566563468
Macro Precision = 0.8828627069133399
Macro F1 = 0.8812515158864904
ADV
Accuracy = 0.8333333333333334
Weighted Recall = 0.8333333333333334
Weighted Precision = 0.8371040723981902
Weighted F1 = 0.8342857142857143
Macro Recall = 0.8333333333333334
Macro Precision = 0.8257918552036199
Macro F1 = 0.8285714285714285
NOUN
Accuracy = 0.9053030303030303
Weighted Recall = 0.9053030303030303
Weighted Precision = 0.9053906716546122
Weighted F1 = 0.9052948769628351
Macro Recall = 0.9052729751058182
Macro Precision = 0.9054125819925076
Macro F1 = 0.9052908002927376
VERB
Accuracy = 0.912751677852349
Weighted Recall = 0.912751677852349
Weighted Precision = 0.9130493576741042
Weighted F1 = 0.912735955309276
Macro Recall = 0.912751677852349
Macro Precision = 0.9130493576741041
Macro F1 = 0.912735955309276
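
Note: the per-tag blocks above follow the usual scikit-learn conventions for weighted vs. macro averaging. A minimal sketch of how such a breakdown could be reproduced, assuming gold labels and predictions grouped per POS tag are available (the variable names and the per_tag grouping are illustrative assumptions, not taken from this repository):

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def report_for_tag(y_true, y_pred):
    # Accuracy plus weighted and macro precision/recall/F1, in the order printed above.
    print("Accuracy =", accuracy_score(y_true, y_pred))
    for avg in ("weighted", "macro"):
        p, r, f1, _ = precision_recall_fscore_support(
            y_true, y_pred, average=avg, zero_division=0
        )
        print(f"{avg.capitalize()} Recall =", r)
        print(f"{avg.capitalize()} Precision =", p)
        print(f"{avg.capitalize()} F1 =", f1)

# Hypothetical usage: per_tag maps a POS tag to its (gold labels, predictions).
# for tag, (y_true, y_pred) in per_tag.items():
#     print(tag)
#     report_for_tag(y_true, y_pred)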