---
datasets:
- squad
tags:
- question-generation
widget:
- text: "Harry Potter is a series of seven fantasy novels written by British author, [HL]J. K. Rowling[HL]."
---
# Transformer QG on SQuAD
HLQG was proposed by [Ying-Hong Chan & Yao-Chung Fan. (2019). A Recurrent BERT-based Model for Question Generation.](https://www.aclweb.org/anthology/D19-5821/)

**This is a reproduced version.**

More details: [p208p2002/Transformer-QG-on-SQuAD](https://github.com/p208p2002/Transformer-QG-on-SQuAD)
## Usage
### Input Format
```
C' = [c1, c2, ..., [HL], a1, ..., a|A|, [HL], ..., c|C|]
```
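To make the format concrete, below is a minimal sketch of producing this input from a plain context and an answer span. The `highlight_answer` helper and its arguments are illustrative only and are not part of this repository.

```python
# Illustrative helper (not part of this repo): wrap the answer span with [HL] markers.
def highlight_answer(context: str, answer_start: int, answer_text: str) -> str:
    answer_end = answer_start + len(answer_text)
    return (context[:answer_start]
            + "[HL]" + context[answer_start:answer_end] + "[HL]"
            + context[answer_end:])

context = "Harry Potter is a series of seven fantasy novels written by British author, J. K. Rowling."
answer = "J. K. Rowling"
print(highlight_answer(context, context.index(answer), answer))
# Harry Potter is a series of seven fantasy novels written by British author, [HL]J. K. Rowling[HL].
```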
### Input Example
```
Harry Potter is a series of seven fantasy novels written by British author, [HL]J. K. Rowling[HL].
```
> # Who wrote Harry Potter?
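A minimal generation sketch with Hugging Face Transformers is shown below. It assumes this checkpoint is published under the Hub ID `p208p2002/t5-squad-nqg-hl` and can be loaded as a seq2seq model; the decoding settings are illustrative, not the settings used for the reported scores.

```python
# Minimal usage sketch; the model ID and generation settings are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "p208p2002/t5-squad-nqg-hl"  # assumed Hub ID of this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

context = ("Harry Potter is a series of seven fantasy novels written by "
           "British author, [HL]J. K. Rowling[HL].")

inputs = tokenizer(context, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# e.g. "Who wrote Harry Potter?"
```

Depending on how the checkpoint was trained, `[HL]` may be handled as plain text or as an added special token; see the linked repository for the exact preprocessing.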
## Data setting
We report results under the following two dataset settings.

### SQuAD
- train: 87599
- validation: 10570
> [SQuAD: 100,000+ Questions for Machine Comprehension of Text](https://arxiv.org/abs/1606.05250)
### SQuAD NQG
- train: 75722
- dev: 10570
- test: 11877
> [Learning to Ask: Neural Question Generation for Reading Comprehension](https://arxiv.org/abs/1705.00106)
## Available models
- BART
- GPT2
- T5
## Experiments
We report scores computed with the `NQG Scorer` used in SQuAD NQG.

Unless otherwise noted, the model size defaults to "base".

### SQuAD
Model                            |Bleu 1|Bleu 2|Bleu 3|Bleu 4|METEOR|ROUGE-L|
---------------------------------|------|------|------|------|------|-------|
BART-HLSQG                       |54.67 |39.26 |30.34 |24.15 |25.43 |52.64  |
GPT2-HLSQG                       |49.31 |33.95 |25.41 |19.69 |22.29 |48.82  |
T5-HLSQG                         |54.29 |39.22 |30.43 |24.26 |25.56 |53.11  |

### SQuAD NQG
Model                            |Bleu 1|Bleu 2|Bleu 3|Bleu 4|METEOR|ROUGE-L|
---------------------------------|------|------|------|------|------|-------|
BERT-HLSQG (Chan et al.)         |49.73 |34.60 |26.13 |20.33 |23.88 |48.23  |
BART-HLSQG                       |54.12 |38.19 |28.84 |22.35 |24.55 |51.03  |
GPT2-HLSQG                       |49.82 |33.69 |24.71 |18.63 |21.90 |47.60  |
T5-HLSQG                         |53.13 |37.60 |28.62 |22.38 |24.48 |51.20  |