mt5-small-ruquad-qg/eval/metric.middle.sentence.sentence_answer.question.lmqg_qg_ruquad.default.json
{
  "validation": {
    "Bleu_1": 0.29628712399747975,
    "Bleu_2": 0.23173953059583766,
    "Bleu_3": 0.1857628216265251,
    "Bleu_4": 0.15065522671229967,
    "METEOR": 0.2574638730879857,
    "ROUGE_L": 0.3053120917228644,
    "BERTScore": 0.8350636084642281,
    "MoverScore": 0.6183359251345745
  },
  "test": {
    "Bleu_1": 0.3009151127617969,
    "Bleu_2": 0.2363374838623006,
    "Bleu_3": 0.19048764934293672,
    "Bleu_4": 0.15583711125687622,
    "METEOR": 0.2565280875416422,
    "ROUGE_L": 0.3066134044354321,
    "BERTScore": 0.8343673868361797,
    "MoverScore": 0.6154820228141442
  }
}