Multi-hop Question Generation with Graph Convolutional Network
- URL: http://arxiv.org/abs/2010.09240v2
- Date: Tue, 9 Feb 2021 08:59:23 GMT
- Title: Multi-hop Question Generation with Graph Convolutional Network
- Authors: Dan Su, Yan Xu, Wenliang Dai, Ziwei Ji, Tiezheng Yu, Pascale Fung
- Abstract summary: Multi-hop Question Generation (QG) aims to generate answer-related questions by aggregating and reasoning over multiple pieces of scattered evidence from different paragraphs.
We propose the Multi-Hop Encoding Fusion Network for Question Generation (MulQG), which performs context encoding in multiple hops.
Our proposed model is able to generate fluent questions with high completeness and outperforms the strongest baseline by 20.8% in the multi-hop evaluation.
- Score: 58.31752179830959
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-hop Question Generation (QG) aims to generate answer-related questions
by aggregating and reasoning over multiple pieces of scattered evidence from
different paragraphs. It is a more challenging yet under-explored task compared to
conventional single-hop QG, where the questions are generated from the sentence
containing the answer or nearby sentences in the same paragraph without complex
reasoning. To address the additional challenges in multi-hop QG, we propose
Multi-Hop Encoding Fusion Network for Question Generation (MulQG), which performs
context encoding in multiple hops with a Graph Convolutional Network and encoding
fusion via an Encoder Reasoning Gate. To the best of our knowledge, we are the
first to tackle the challenge of multi-hop reasoning over paragraphs without
any sentence-level information. Empirical results on the HotpotQA dataset
demonstrate the effectiveness of our method in comparison with baselines on
automatic evaluation metrics. Moreover, from the human evaluation, our proposed
model is able to generate fluent questions with high completeness and
outperforms the strongest baseline by 20.8% in the multi-hop evaluation. The
code is publicly available at
https://github.com/HLTCHKUST/MulQG .
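To make the encoding scheme described in the abstract more concrete, the following is a minimal, self-contained PyTorch sketch (assumed, not the authors' released code) of the two ideas named there: graph-convolutional message passing over paragraph-level nodes for multi-hop context encoding, and a sigmoid gate that fuses the original encoding with the graph-updated one, in the spirit of the Encoder Reasoning Gate. All module names, dimensions, and the number of hops are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' implementation) of GCN-based multi-hop
# context encoding with gated encoding fusion. Node states could be paragraph or
# entity encodings; the adjacency matrix encodes which nodes may exchange information.
import torch
import torch.nn as nn


class GraphConvLayer(nn.Module):
    """One GCN hop: each node aggregates the mean of its neighbours' states."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)

    def forward(self, node_states: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_states: (batch, num_nodes, hidden); adj: (batch, num_nodes, num_nodes)
        degree = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)  # avoid division by zero
        neighbour_mean = torch.bmm(adj, node_states) / degree
        return torch.relu(self.linear(neighbour_mean))


class GatedEncodingFusion(nn.Module):
    """Sigmoid gate mixing the original encoding with the graph-reasoned encoding."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, original: torch.Tensor, reasoned: torch.Tensor) -> torch.Tensor:
        g = torch.sigmoid(self.gate(torch.cat([original, reasoned], dim=-1)))
        return g * reasoned + (1.0 - g) * original


class MultiHopEncoder(nn.Module):
    """Stacks several GCN hops, fusing the updated states back after each hop."""

    def __init__(self, hidden_size: int, num_hops: int = 2):
        super().__init__()
        self.hops = nn.ModuleList([GraphConvLayer(hidden_size) for _ in range(num_hops)])
        self.fusions = nn.ModuleList([GatedEncodingFusion(hidden_size) for _ in range(num_hops)])

    def forward(self, node_states: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        states = node_states
        for hop, fuse in zip(self.hops, self.fusions):
            states = fuse(states, hop(states, adj))
        return states


if __name__ == "__main__":
    batch, num_nodes, hidden = 2, 5, 64
    encoder = MultiHopEncoder(hidden, num_hops=2)
    nodes = torch.randn(batch, num_nodes, hidden)                  # e.g. paragraph encodings
    adj = (torch.rand(batch, num_nodes, num_nodes) > 0.5).float()  # toy adjacency
    print(encoder(nodes, adj).shape)                               # torch.Size([2, 5, 64])
```

In MulQG proper the graph is built over answer-aware paragraph-level nodes; here a generic adjacency matrix stands in for that construction.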
Related papers
- Improving Question Generation with Multi-level Content Planning [70.37285816596527]
This paper addresses the problem of generating questions from a given context and an answer, specifically focusing on questions that require multi-hop reasoning across an extended context.
We propose MultiFactor, a novel QG framework based on multi-level content planning. Specifically, MultiFactor includes two components: an FA-model, which simultaneously selects key phrases and generates full answers, and a Q-model, which takes the generated full answer as an additional input to generate questions.
arXiv Detail & Related papers (2023-10-20T13:57:01Z) - Towards Graph-hop Retrieval and Reasoning in Complex Question Answering
over Textual Database [15.837457557803507]
Graph-Hop is a novel multi-chain, multi-hop retrieval and reasoning paradigm for complex question answering.
We construct a new benchmark called ReasonGraphQA, which provides explicit and fine-grained evidence graphs for complex questions.
arXiv Detail & Related papers (2023-05-23T16:28:42Z) - Understanding and Improving Zero-shot Multi-hop Reasoning in Generative
Question Answering [85.79940770146557]
We decompose multi-hop questions into multiple corresponding single-hop questions.
We find marked inconsistency in QA models' answers on these pairs of ostensibly identical question chains.
When trained only on single-hop questions, models generalize poorly to multi-hop questions.
arXiv Detail & Related papers (2022-10-09T11:48:07Z) - Modeling Multi-hop Question Answering as Single Sequence Prediction [88.72621430714985]
We propose a simple generative approach (PathFid) that extends the task beyond just answer generation.
PathFid explicitly models the reasoning process to resolve the answer for multi-hop questions.
Our experiments demonstrate that PathFid leads to strong performance gains on two multi-hop QA datasets.
arXiv Detail & Related papers (2022-05-18T21:57:59Z) - Ask to Understand: Question Generation for Multi-hop Question Answering [11.626390908264872]
Multi-hop Question Answering (QA) requires the machine to answer complex questions by finding scattered clues and reasoning over multiple documents.
We propose a novel method to tackle multi-hop QA from the perspective of Question Generation (QG).
arXiv Detail & Related papers (2022-03-17T04:02:29Z) - QA4QG: Using Question Answering to Constrain Multi-Hop Question
Generation [54.136509061542775]
Multi-hop question generation (MQG) aims to generate complex questions which require reasoning over multiple pieces of information in the input passage.
We propose a novel framework, QA4QG, a QA-augmented BART-based framework for MQG.
Our results on the HotpotQA dataset show that QA4QG outperforms all state-of-the-art models.
arXiv Detail & Related papers (2022-02-14T08:16:47Z) - Unsupervised Multi-hop Question Answering by Question Generation [108.61653629883753]
MQA-QG is an unsupervised framework that can generate human-like multi-hop training data.
Using only generated training data, we can train a competent multi-hop QA model that achieves 61% and 83% of the supervised learning performance.
arXiv Detail & Related papers (2020-10-23T19:13:47Z) - Reinforced Multi-task Approach for Multi-hop Question Generation [47.15108724294234]
We take up multi-hop question generation, which aims at generating relevant questions based on supporting facts in the context.
We employ multi-task learning with the auxiliary task of answer-aware supporting fact prediction to guide the question generator; an illustrative sketch of such a multi-task loss follows this list.
We demonstrate the effectiveness of our approach through experiments on the multi-hop question answering dataset, HotpotQA.
arXiv Detail & Related papers (2020-04-05T10:16:59Z)
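As referenced in the "Reinforced Multi-task Approach for Multi-hop Question Generation" entry above, the sketch below illustrates one plausible way to combine a question-generation loss with an auxiliary answer-aware supporting-fact prediction loss. The function name, tensor shapes, and the 0.5 auxiliary weight are assumptions for illustration, not details taken from that paper.

```python
# Illustrative multi-task loss (assumed, not taken from the paper): question generation
# is the main objective, answer-aware supporting-fact prediction the auxiliary one.
# The function name, tensor shapes, and the 0.5 weight are hypothetical.
import torch
import torch.nn.functional as F


def multitask_loss(qg_logits, question_ids, sf_logits, sf_labels, aux_weight=0.5):
    # Main task: token-level cross-entropy over the generated question.
    qg_loss = F.cross_entropy(
        qg_logits.view(-1, qg_logits.size(-1)), question_ids.view(-1)
    )
    # Auxiliary task: binary prediction of whether each sentence is a supporting fact.
    sf_loss = F.binary_cross_entropy_with_logits(sf_logits, sf_labels.float())
    return qg_loss + aux_weight * sf_loss


# Toy shapes: (batch, seq_len, vocab) logits, (batch, seq_len) target ids,
# (batch, num_sentences) supporting-fact logits and 0/1 labels.
qg_logits = torch.randn(2, 10, 100)
question_ids = torch.randint(0, 100, (2, 10))
sf_logits = torch.randn(2, 6)
sf_labels = torch.randint(0, 2, (2, 6))
print(multitask_loss(qg_logits, question_ids, sf_logits, sf_labels))
```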
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.