mbart-large-cc25-dequad-qg / eval / metric.first.answer.paragraph.questions_answers.lmqg_qg_dequad.default.json
{"test": {"QAAlignedF1Score (BERTScore)": 0.906647733818443, "QAAlignedF1Score (MoverScore)": 0.653647697608496, "QAAlignedRecall (BERTScore)": 0.9066477395394856, "QAAlignedPrecision (BERTScore)": 0.9066477395394856, "QAAlignedRecall (MoverScore)": 0.6536478786377148, "QAAlignedPrecision (MoverScore)": 0.6536478786377148}, "validation": {"QAAlignedF1Score (BERTScore)": 0.9203702769756086, "QAAlignedF1Score (MoverScore)": 0.67005561892494, "QAAlignedRecall (BERTScore)": 0.9203702765495896, "QAAlignedPrecision (BERTScore)": 0.9203702765495896, "QAAlignedRecall (MoverScore)": 0.6700558047600201, "QAAlignedPrecision (MoverScore)": 0.6700558047600201}}