SGPT-1.3B-weightedmean-msmarco-specb-bitfit/evaluation/mteb/AmazonCounterfactualClassification.json
{
  "dataset_version": null,
  "mteb_version": "0.0.2",
  "test": {
    "en": {
      "accuracy": 0.652089552238806,
      "accuracy_stderr": 0.04707742824740793,
      "ap": 0.2959212705444778,
      "ap_stderr": 0.022393345886320606,
      "f1": 0.5997099864321921,
      "f1_stderr": 0.036697739411917986,
      "main_score": 0.652089552238806
    },
    "evaluation_time": 23.71
  }
}
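A minimal sketch of reading scores out of a result file in this layout with Python's `json` module. The JSON is embedded as a literal here for illustration; in practice you would `json.load` the file itself, and the key layout (`test` → language → metrics) is assumed to match the file above.

```python
import json

# The MTEB result file above, embedded as a literal for illustration.
raw = """
{
  "dataset_version": null,
  "mteb_version": "0.0.2",
  "test": {
    "en": {
      "accuracy": 0.652089552238806,
      "accuracy_stderr": 0.04707742824740793,
      "ap": 0.2959212705444778,
      "ap_stderr": 0.022393345886320606,
      "f1": 0.5997099864321921,
      "f1_stderr": 0.036697739411917986,
      "main_score": 0.652089552238806
    },
    "evaluation_time": 23.71
  }
}
"""

result = json.loads(raw)
en = result["test"]["en"]

# "main_score" mirrors "accuracy" here, i.e. accuracy is the headline
# metric for this classification task.
print(f"accuracy: {en['accuracy']:.4f} +/- {en['accuracy_stderr']:.4f}")
print(f"f1:       {en['f1']:.4f} +/- {en['f1_stderr']:.4f}")
print(f"main_score: {en['main_score']:.4f}")
```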