Question Generator
HLSQG was proposed by Ying-Hong Chan & Yao-Chung Fan (2019) in A Recurrent BERT-based Model for Question Generation.
This is a reproduced version.
More details: p208p2002/Transformer-QG-on-SQuAD
The answer span A is marked inside the context C with [HL] tokens, producing the highlighted input C':

C' = [c1, c2, ..., [HL], a1, ..., a|A|, [HL], ..., c|C|]
Input example: Harry Potter is a series of seven fantasy novels written by British author [HL]J. K. Rowling[HL].
Output example: Who wrote Harry Potter?
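As a minimal sketch, the snippet below shows how this highlighted-input format could be used with a seq2seq checkpoint from this collection via Hugging Face transformers. The checkpoint name and the simple string-replace highlighting are assumptions for illustration; see p208p2002/Transformer-QG-on-SQuAD for the exact preprocessing and inference code.

```python
# Sketch: HL-style question generation with transformers.
# The checkpoint name below is an assumption; substitute the actual
# HLSQG checkpoint you want to use from this collection.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "p208p2002/t5-squad-qg-hl"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def highlight(context: str, answer: str) -> str:
    """Wrap the first occurrence of the answer span with [HL] tokens (C -> C')."""
    return context.replace(answer, f"[HL]{answer}[HL]", 1)

context = ("Harry Potter is a series of seven fantasy novels "
           "written by British author J. K. Rowling.")
answer = "J. K. Rowling"

inputs = tokenizer(highlight(context, answer), return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# Expected output (roughly): "Who wrote Harry Potter?"
```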
We report results under two dataset settings, as shown below. The SQuAD NQG setting follows the data split from Learning to Ask: Neural Question Generation for Reading Comprehension (Du et al., 2017).
Scores are computed with the NQG Scorer, which is used for SQuAD NQG.
Unless otherwise specified, the model size defaults to "base".
SQuAD

Model | BLEU-1 | BLEU-2 | BLEU-3 | BLEU-4 | METEOR | ROUGE-L |
---|---|---|---|---|---|---|
BART-HLSQG | 54.67 | 39.26 | 30.34 | 24.15 | 25.43 | 52.64 |
GPT2-HLSQG | 49.31 | 33.95 | 25.41 | 19.69 | 22.29 | 48.82 |
T5-HLSQG | 54.29 | 39.22 | 30.43 | 24.26 | 25.56 | 53.11 |
SQuAD NQG

Model | BLEU-1 | BLEU-2 | BLEU-3 | BLEU-4 | METEOR | ROUGE-L |
---|---|---|---|---|---|---|
BERT-HLSQG (Chan et al.) | 49.73 | 34.60 | 26.13 | 20.33 | 23.88 | 48.23 |
BART-HLSQG | 54.12 | 38.19 | 28.84 | 22.35 | 24.55 | 51.03 |
GPT2-HLSQG | 49.82 | 33.69 | 24.71 | 18.63 | 21.90 | 47.60 |
T5-HLSQG | 53.13 | 37.60 | 28.62 | 22.38 | 24.48 | 51.20 |