mt5-small-ruquad-qg-ae / eval / metric.first.answer.paragraph.questions_answers.lmqg_qg_ruquad.default.json
{"test": {"QAAlignedF1Score (BERTScore)": 0.7973821321725308, "QAAlignedF1Score (MoverScore)": 0.5668638524057164, "QAAlignedRecall (BERTScore)": 0.8383368717732346, "QAAlignedPrecision (BERTScore)": 0.7615392588632003, "QAAlignedRecall (MoverScore)": 0.5979098322311048, "QAAlignedPrecision (MoverScore)": 0.5410917350743871}, "validation": {"QAAlignedF1Score (BERTScore)": 0.7984313799677859, "QAAlignedF1Score (MoverScore)": 0.5680695737182676, "QAAlignedRecall (BERTScore)": 0.8398488256323812, "QAAlignedPrecision (BERTScore)": 0.7622311359168065, "QAAlignedRecall (MoverScore)": 0.5992589770914049, "QAAlignedPrecision (MoverScore)": 0.542067345992069}}