mt5-small-ruquad-qg / eval /metric.long.sentence.sentence_answer.question.lmqg_qg_ruquad.default.json
{
  "validation": {
    "Bleu_1": 0.29641836540292826,
    "Bleu_2": 0.2319057656711498,
    "Bleu_3": 0.18594132450226736,
    "Bleu_4": 0.15082911979467992,
    "METEOR": 0.25756467437539815,
    "ROUGE_L": 0.305469488896553,
    "BERTScore": 0.8350848786393265,
    "MoverScore": 0.6183380878087487
  },
  "test": {
    "Bleu_1": 0.3009437393518195,
    "Bleu_2": 0.23636316043023595,
    "Bleu_3": 0.19050607001349915,
    "Bleu_4": 0.15584319661234716,
    "METEOR": 0.25651442102663957,
    "ROUGE_L": 0.3066223525895429,
    "BERTScore": 0.8343574191748435,
    "MoverScore": 0.6154866087547037
  }
}