mt5-small-dequad-qg-ae / eval /metric.first.answer.paragraph.questions_answers.lmqg_qg_dequad.default.json
{"test": {"QAAlignedF1Score (BERTScore)": 0.8001787025118497, "QAAlignedF1Score (MoverScore)": 0.5399337820733731, "QAAlignedRecall (BERTScore)": 0.812320688466097, "QAAlignedPrecision (BERTScore)": 0.7891379324540309, "QAAlignedRecall (MoverScore)": 0.5427046152241427, "QAAlignedPrecision (MoverScore)": 0.537701100527795}, "validation": {"QAAlignedF1Score (BERTScore)": 0.7968514576823058, "QAAlignedF1Score (MoverScore)": 0.5367417390383296, "QAAlignedRecall (BERTScore)": 0.8279070731753848, "QAAlignedPrecision (BERTScore)": 0.7688479039115325, "QAAlignedRecall (MoverScore)": 0.551513774610949, "QAAlignedPrecision (MoverScore)": 0.5235551868389352}}