{
  "test": {
    "en": {
      "accuracy": 0.39191999999999994,
      "accuracy_stderr": 0.023273538622220733,
      "f1": 0.38580766731113825,
      "f1_stderr": 0.018793905233795604,
      "main_score": 0.39191999999999994
    },
    "evaluation_time": 2060.82
  },
  "dataset_version": null,
  "mteb_version": "0.0.2"
}