name,Faithfulness,Answer_Relevancy,Answer_Correctness,Answer_Similarity
Qwen1.5-7B-Chat,0.7784179666191972,0.6446005816609537,0.754576907135137,0.685739970384688
Gemma-2B,0.24509853619131058,0.3864916275690845,0.7478997831333918,0.5331677452195527
Qwen1.5-1.8B-Chat,0.6049019607843137,0.548904353980019,0.7198411238809446,0.9040869829649206
Yi-6B-Chat,0.7810660719751629,0.7167976589186875,0.6766795202404136,0.9015276627615246
Gpt-3.5-Turbo,0.7246376811594203,0.8043651144702015,0.6509014034090643,0.9218230132938211
Vicuna-7B-V1.5,0.4022032693674485,0.6008780718971581,0.6017002573832742,0.8863580258321797
Qwen1.5-14B-Chat,0.6328502415458936,0.7221033096305126,0.5966395914029399,0.9145067669966427
Qwen1.5-4B-Chat,0.5223151244890376,0.7184562339362471,0.5920241633149731,0.9031485610681586
Qwen1.5-0.5B-Chat,0.4166666666666667,0.6888304890847555,0.5800664845337345,0.8918967698708089
Internlm2-Chat-20B,0.7642667437926058,0.4826757943830156,0.5799585238847701,0.8876497593181047
Baichuan2-7B-Chat,0.48357487922705317,0.6222193378376162,0.578165574028535,0.9011250283968237
Vicuna-13B-V1.5,0.5991387785360396,0.6932977841508635,0.5747798091575604,0.8409686000423541
Baichuan2-13B-Chat,0.4896135265700483,0.7843769138572264,0.5653592173980313,0.9083387421281699
Internlm2-Chat-7B,0.558064058956916,0.10309630273051296,0.5526199329356135,0.8526944078477274
Yi-6B,0.3099052131839017,0.3558360880812697,0.5458649977309493,0.8362572777967558
Mistral-7B,0.6256150793650794,0.29009866821782115,0.5221570098966367,0.8399808519337731
Gemma-7B,0.4451515151515152,0.3045735267275342,0.5168775971677172,0.8368893516841295