Improve Query Focused Abstractive Summarization by Incorporating Answer Relevance
- URL: http://arxiv.org/abs/2105.12969v2
- Date: Mon, 31 May 2021 04:38:58 GMT
- Title: Improve Query Focused Abstractive Summarization by Incorporating Answer Relevance
- Authors: Dan Su, Tiezheng Yu, Pascale Fung
- Abstract summary: We propose QFS-BART, a model that incorporates the explicit answer relevance of the source documents given the query via a question answering model.
Our model can take advantage of large pre-trained models, which significantly improves summarization performance.
Empirical results on the Debatepedia dataset show that the proposed model achieves new state-of-the-art performance.
- Score: 43.820971952979875
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Query focused summarization (QFS) models aim to generate summaries from
source documents that can answer the given query. Most previous work on QFS
only considers the query relevance criterion when producing the summary.
However, studying the effect of answer relevance in the summary generation
process is also important. In this paper, we propose QFS-BART, a model that
incorporates the explicit answer relevance of the source documents given the
query via a question answering model, to generate coherent and answer-related
summaries. Furthermore, our model can take advantage of large pre-trained
models which improve the summarization performance significantly. Empirical
results on the Debatepedia dataset show that the proposed model achieves
new state-of-the-art performance.
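The abstract describes attaching explicit QA-derived answer-relevance signals to the source document before summarization. A minimal sketch of one way this could look in practice is below; the bucketing thresholds, special tokens, and function names are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch: attach answer-relevance scores (as produced by a QA
# model) to each source sentence, then build a query+document input string
# for a seq2seq summarizer such as BART. Token names are assumptions.

def bucket_relevance(score: float) -> str:
    """Map a QA answer-relevance probability to a coarse special token."""
    if score >= 0.66:
        return "<rel_high>"
    if score >= 0.33:
        return "<rel_mid>"
    return "<rel_low>"

def build_qfs_input(query: str, sentences: list, relevance: list) -> str:
    """Prefix the query, then interleave relevance tokens with sentences."""
    tagged = [f"{bucket_relevance(s)} {sent}"
              for sent, s in zip(sentences, relevance)]
    return f"<query> {query} <doc> " + " ".join(tagged)

example = build_qfs_input(
    "Should plastic bags be banned?",
    ["Plastic bags harm marine life.", "Bags are convenient for shoppers."],
    [0.91, 0.12],
)
print(example)
```

In this sketch, the summarizer sees relevance hints inline with the text, so attention can favor spans a QA model judged likely to answer the query; the actual QFS-BART model may inject these signals differently.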
Related papers
- IDEAL: Leveraging Infinite and Dynamic Characterizations of Large Language Models for Query-focused Summarization [59.06663981902496]
Query-focused summarization (QFS) aims to produce summaries that answer particular questions of interest, enabling greater user control and personalization.
We investigate two indispensable characteristics that LLM-based QFS models should harness: Lengthy Document Summarization and Efficiently Fine-grained Query-LLM Alignment.
These innovations pave the way for broader application and accessibility in the field of QFS technology.
arXiv Detail & Related papers (2024-07-15T07:14:56Z)
- QontSum: On Contrasting Salient Content for Query-focused Summarization [22.738731393540633]
Query-focused summarization (QFS) is a challenging task in natural language processing that generates summaries to address specific queries.
This paper highlights the role of QFS in Grounded Answer Generation (GAR).
We propose QontSum, a novel approach for QFS that leverages contrastive learning to help the model attend to the most relevant regions of the input document.
arXiv Detail & Related papers (2023-07-14T19:25:35Z)
- LMGQS: A Large-scale Dataset for Query-focused Summarization [77.6179359525065]
We convert four generic summarization benchmarks into a new QFS benchmark dataset, LMGQS.
We establish baselines with state-of-the-art summarization models.
We achieve state-of-the-art zero-shot and supervised performance on multiple existing QFS benchmarks.
arXiv Detail & Related papers (2023-05-22T14:53:45Z)
- Query-Utterance Attention with Joint modeling for Query-Focused Meeting Summarization [4.763356598070365]
We propose a query-aware framework with joint modeling token and utterance based on Query-Utterance Attention.
We show that the query relevance of different granularities contributes to generating a summary more related to the query.
arXiv Detail & Related papers (2023-03-08T10:21:45Z)
- MQAG: Multiple-choice Question Answering and Generation for Assessing Information Consistency in Summarization [55.60306377044225]
State-of-the-art summarization systems can generate highly fluent summaries.
These summaries, however, may contain factual inconsistencies and/or information not present in the source.
We introduce an alternative scheme based on standard information-theoretic measures in which the information present in the source and summary is directly compared.
arXiv Detail & Related papers (2023-01-28T23:08:25Z)
- Abstractive Query Focused Summarization with Query-Free Resources [60.468323530248945]
In this work, we consider the problem of leveraging only generic summarization resources to build an abstractive QFS system.
We propose Marge, a Masked ROUGE Regression framework composed of a novel unified representation for summaries and queries.
Despite learning from minimal supervision, our system achieves state-of-the-art results in the distantly supervised setting.
arXiv Detail & Related papers (2020-12-29T14:39:35Z)
- Query Focused Multi-Document Summarization with Distant Supervision [88.39032981994535]
Existing work relies heavily on retrieval-style methods for estimating the relevance between queries and text segments.
We propose a coarse-to-fine modeling framework which introduces separate modules for estimating whether segments are relevant to the query.
We demonstrate that our framework outperforms strong comparison systems on standard QFS benchmarks.
arXiv Detail & Related papers (2020-04-06T22:35:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.