mt5-base-dequad-qg-ae / eval /metric.first.answer.paragraph.questions_answers.lmqg_qg_dequad.default.json
{"test": {"QAAlignedF1Score (BERTScore)": 0.06114083814726105, "QAAlignedRecall (BERTScore)": 0.05951994963452167, "QAAlignedPrecision (BERTScore)": 0.06296209770918898, "QAAlignedF1Score (MoverScore)": 0.042392850762007585, "QAAlignedRecall (MoverScore)": 0.041482415598281065, "QAAlignedPrecision (MoverScore)": 0.043389094517977525}, "validation": {"QAAlignedF1Score (BERTScore)": 0.058092628294991754, "QAAlignedRecall (BERTScore)": 0.05777893833607408, "QAAlignedPrecision (BERTScore)": 0.05842912770545256, "QAAlignedF1Score (MoverScore)": 0.040414317104309955, "QAAlignedRecall (MoverScore)": 0.04022585926554541, "QAAlignedPrecision (MoverScore)": 0.04061684401311686}}