---
datasets:
  - squad
tags:
  - question-generation
widget:
  - text: >-
      Harry Potter is a series of seven fantasy novels written by British
      author, [HL]J. K. Rowling[HL].
---

# Transformer QG on SQuAD

HLQG was proposed by Ying-Hong Chan & Yao-Chung Fan (2019), *A Recurrent BERT-based Model for Question Generation*.

This is a reproduced version.

More details: [p208p2002/Transformer-QG-on-SQuAD](https://github.com/p208p2002/Transformer-QG-on-SQuAD)

## Usage

### Input Format

```
C' = [c1, c2, ..., [HL], a1, ..., a|A|, [HL], ..., c|C|]
```
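For example, given a SQuAD-style (context, answer, answer_start) triple, the highlighted input C' can be built as in the sketch below. `highlight_answer` is a hypothetical helper written for illustration; it is not part of this repository.

```python
def highlight_answer(context: str, answer_start: int, answer_text: str) -> str:
    """Wrap the answer span in [HL] tokens, producing the C' input format.

    `answer_start` is the character offset of the answer in `context`,
    as provided by SQuAD-style annotations.
    """
    answer_end = answer_start + len(answer_text)
    assert context[answer_start:answer_end] == answer_text, "offset/answer mismatch"
    return (
        context[:answer_start]
        + "[HL]" + answer_text + "[HL]"
        + context[answer_end:]
    )


context = (
    "Harry Potter is a series of seven fantasy novels written by British "
    "author, J. K. Rowling."
)
print(highlight_answer(context, context.index("J. K. Rowling"), "J. K. Rowling"))
# -> Harry Potter is a series of seven fantasy novels written by British author, [HL]J. K. Rowling[HL].
```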

### Input Example

```
Harry Potter is a series of seven fantasy novels written by British author, [HL]J. K. Rowling[HL].
```

Expected generated question:

```
Who wrote Harry Potter?
```
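A minimal generation sketch with Hugging Face Transformers, assuming the checkpoint is published on the Hub as `p208p2002/gpt2-squad-nqg-hl` and that plain causal-LM decoding of the highlighted context is sufficient; the exact prompt and separator conventions are defined in the companion repository.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: this model card's checkpoint lives at this Hub id.
model_id = "p208p2002/gpt2-squad-nqg-hl"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

context = (
    "Harry Potter is a series of seven fantasy novels written by British "
    "author, [HL]J. K. Rowling[HL]."
)

inputs = tokenizer(context, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=32,               # budget for the generated question
    num_beams=4,                     # beam search, commonly used for QG
    early_stopping=True,
    pad_token_id=tokenizer.eos_token_id,
)

# Causal LMs return prompt + continuation; keep only the new tokens.
generated = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```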

## Data setting

We report results under two dataset settings:

- SQuAD
- SQuAD NQG

## Available models

- BART
- GPT2
- T5

## Experiments

We report scores computed with the NQG Scorer, which is used in SQuAD NQG.

Unless otherwise specified, the model size defaults to "base".

### SQuAD

| Model | BLEU-1 | BLEU-2 | BLEU-3 | BLEU-4 | METEOR | ROUGE-L |
|-------|--------|--------|--------|--------|--------|---------|
| BART-HLSQG | 54.67 | 39.26 | 30.34 | 24.15 | 25.43 | 52.64 |
| GPT2-HLSQG | 49.31 | 33.95 | 25.41 | 19.69 | 22.29 | 48.82 |
| T5-HLSQG | 54.29 | 39.22 | 30.43 | 24.26 | 25.56 | 53.11 |

### SQuAD NQG

| Model | BLEU-1 | BLEU-2 | BLEU-3 | BLEU-4 | METEOR | ROUGE-L |
|-------|--------|--------|--------|--------|--------|---------|
| BERT-HLSQG (Chan et al.) | 49.73 | 34.60 | 26.13 | 20.33 | 23.88 | 48.23 |
| BART-HLSQG | 54.12 | 38.19 | 28.84 | 22.35 | 24.55 | 51.03 |
| GPT2-HLSQG | 49.82 | 33.69 | 24.71 | 18.63 | 21.90 | 47.60 |
| T5-HLSQG | 53.13 | 37.60 | 28.62 | 22.38 | 24.48 | 51.20 |