t5-large-subjqa-vanilla-books-qg / eval /metric.first.answer.sentence_answer.question.lmqg_qg_subjqa.books.json
Updated by asahi417 (commit a3feda7, "update")
{
  "validation": {
    "Bleu_1": 0.03662258392671758,
    "Bleu_2": 0.006400379596429068,
    "Bleu_3": 3.659331287297016e-08,
    "Bleu_4": 8.885782121236077e-11,
    "METEOR": 0.027939566441287395,
    "ROUGE_L": 0.0303604781624224,
    "BERTScore": 0.7830161977073421,
    "MoverScore": 0.5131425431257296
  },
  "test": {
    "Bleu_1": 0.036061820263287886,
    "Bleu_2": 0.009622104987923681,
    "Bleu_3": 3.996907155150396e-08,
    "Bleu_4": 8.276153056231133e-11,
    "METEOR": 0.02532440376679995,
    "ROUGE_L": 0.028664004061913845,
    "BERTScore": 0.7925587979910885,
    "MoverScore": 0.5160217417890958
  }
}
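A minimal sketch of reading these metrics with the standard library, assuming the JSON above has been downloaded (here the blob is inlined as a string for a self-contained example; in practice you would `json.load` the file from the repo):

```python
import json

# The metrics blob from this file, inlined verbatim as a JSON string.
raw = (
    '{"validation": {"Bleu_1": 0.03662258392671758, "Bleu_2": 0.006400379596429068, '
    '"Bleu_3": 3.659331287297016e-08, "Bleu_4": 8.885782121236077e-11, '
    '"METEOR": 0.027939566441287395, "ROUGE_L": 0.0303604781624224, '
    '"BERTScore": 0.7830161977073421, "MoverScore": 0.5131425431257296}, '
    '"test": {"Bleu_1": 0.036061820263287886, "Bleu_2": 0.009622104987923681, '
    '"Bleu_3": 3.996907155150396e-08, "Bleu_4": 8.276153056231133e-11, '
    '"METEOR": 0.02532440376679995, "ROUGE_L": 0.028664004061913845, '
    '"BERTScore": 0.7925587979910885, "MoverScore": 0.5160217417890958}}'
)

metrics = json.loads(raw)

# Print validation and test scores side by side for each metric.
for name in metrics["validation"]:
    val, test = metrics["validation"][name], metrics["test"][name]
    print(f"{name:>10}  validation={val:.4g}  test={test:.4g}")
```

The near-zero Bleu_3/Bleu_4 alongside a high BERTScore suggests the generated questions rarely match references word-for-word but stay semantically close, which is a common pattern for subjective-QA question generation.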