Structure-Aware Abstractive Conversation Summarization via Discourse and Action Graphs
- URL: http://arxiv.org/abs/2104.08400v1
- Date: Fri, 16 Apr 2021 23:04:52 GMT
- Title: Structure-Aware Abstractive Conversation Summarization via Discourse and Action Graphs
- Authors: Jiaao Chen, Diyi Yang
- Abstract summary: We propose to explicitly model the rich structures in conversations for more precise and accurate conversation summarization.
We incorporate discourse relations between utterances and action triples in utterances through structured graphs to better encode conversations.
Experiments show that our proposed models outperform state-of-the-art methods and generalize well in other domains.
- Score: 22.58861442978803
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Abstractive conversation summarization has received much attention recently.
However, these generated summaries often suffer from insufficient, redundant,
or incorrect content, largely due to the unstructured and complex
characteristics of human-human interactions. To this end, we propose to
explicitly model the rich structures in conversations for more precise and
accurate conversation summarization, by first incorporating discourse relations
between utterances and action triples ("who-doing-what") in utterances through
structured graphs to better encode conversations, and then designing a
multi-granularity decoder to generate summaries by combining all levels of
information. Experiments show that our proposed models outperform
state-of-the-art methods and generalize well in other domains in terms of both
automatic evaluations and human judgments. We have publicly released our code
at https://github.com/GT-SALT/Structure-Aware-BART.
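The abstract describes encoding conversations with action triples ("who-doing-what") organized into structured graphs. A minimal sketch of that idea, assuming the triples have already been extracted (the extraction step itself, which would require a dependency parser or open information extraction, is not shown, and the example dialogue is hypothetical):

```python
from collections import defaultdict

def build_action_graph(triples):
    """Build a simple action graph from (who, doing, what) triples.

    Each "who" argument becomes a node whose outgoing edges are
    (action, target) pairs. Illustrative only -- the paper's actual
    graph construction and graph encoder are more involved.
    """
    graph = defaultdict(list)  # node -> list of (action, target) edges
    for who, doing, what in triples:
        graph[who].append((doing, what))
    return dict(graph)

# Hypothetical triples extracted from a short dialogue
triples = [
    ("Alice", "proposes", "meeting"),
    ("Bob", "accepts", "meeting"),
    ("Alice", "books", "room"),
]
graph = build_action_graph(triples)
# graph["Alice"] -> [("proposes", "meeting"), ("books", "room")]
```

Such a graph makes explicit who performed which actions across utterances, which is the kind of structure the model's graph encoder can attend over alongside the raw utterance sequence.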
Related papers
- Multi-party Response Generation with Relation Disentanglement [8.478506896774137]
Existing neural response generation models have achieved impressive improvements for two-party conversations.
However, many real-world dialogues involve multiple interlocutors and the structure of conversational context is much more complex.
We propose to automatically infer these relations via relational thinking over subtle clues in the conversation context, without any human labels.
arXiv Detail & Related papers (2024-03-16T06:33:44Z)
- Instructive Dialogue Summarization with Query Aggregations [41.89962538701501]
We introduce instruction-finetuned language models to expand the capability set of dialogue summarization models.
We propose a three-step approach to synthesize high-quality query-based summarization triples.
By training a unified model called InstructDS on three summarization datasets with multi-purpose instructive triples, we expand the capability of dialogue summarization models.
arXiv Detail & Related papers (2023-10-17T04:03:00Z)
- Revisiting Conversation Discourse for Dialogue Disentanglement [88.3386821205896]
We propose enhancing dialogue disentanglement by taking full advantage of the dialogue discourse characteristics.
We develop a structure-aware framework to integrate the rich structural features for better modeling the conversational semantic context.
Our work has great potential to facilitate broader multi-party multi-thread dialogue applications.
arXiv Detail & Related papers (2023-06-06T19:17:47Z)
- Enhancing Semantic Understanding with Self-supervised Methods for Abstractive Dialogue Summarization [4.226093500082746]
We introduce self-supervised methods to compensate for shortcomings in training a dialogue summarization model.
Our principle is to detect incoherent information flows using pretext dialogue text to enhance BERT's ability to contextualize dialogue text representations.
arXiv Detail & Related papers (2022-09-01T07:51:46Z)
- Unsupervised Learning of Hierarchical Conversation Structure [50.29889385593043]
Goal-oriented conversations often have meaningful sub-dialogue structure, but it can be highly domain-dependent.
This work introduces an unsupervised approach to learning hierarchical conversation structure, including turn and sub-dialogue segment labels.
The decoded structure is shown to be useful in enhancing neural models of language for three conversation-level understanding tasks.
arXiv Detail & Related papers (2022-05-24T17:52:34Z)
- ConvoSumm: Conversation Summarization Benchmark and Improved Abstractive Summarization with Argument Mining [61.82562838486632]
We crowdsource four new datasets on diverse online conversation forms of news comments, discussion forums, community question answering forums, and email threads.
We benchmark state-of-the-art models on our datasets and analyze characteristics associated with the data.
arXiv Detail & Related papers (2021-06-01T22:17:13Z)
- Unsupervised Summarization for Chat Logs with Topic-Oriented Ranking and Context-Aware Auto-Encoders [59.038157066874255]
We propose a novel framework called RankAE to perform chat summarization without employing manually labeled data.
RankAE consists of a topic-oriented ranking strategy that selects topic utterances according to centrality and diversity simultaneously.
A denoising auto-encoder is designed to generate succinct but context-informative summaries based on the selected utterances.
arXiv Detail & Related papers (2020-12-14T07:31:17Z)
- Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization [72.54873655114844]
Text summarization is one of the most challenging and interesting problems in NLP.
This work proposes a multi-view sequence-to-sequence model by first extracting conversational structures of unstructured daily chats from different views to represent conversations.
Experiments on a large-scale dialogue summarization corpus demonstrated that our methods significantly outperformed previous state-of-the-art models via both automatic evaluations and human judgment.
arXiv Detail & Related papers (2020-10-04T20:12:44Z)
- Knowledge Graph-Augmented Abstractive Summarization with Semantic-Driven Cloze Reward [42.925345819778656]
We present ASGARD, a novel framework for Abstractive Summarization with Graph-Augmentation and semantic-driven RewarD.
We propose the use of dual encoders, a sequential document encoder and a graph-structured encoder, to maintain the global context and local characteristics of entities.
Results show that our models produce significantly higher ROUGE scores than a variant without the knowledge graph as input on both the New York Times and CNN/Daily Mail datasets.
arXiv Detail & Related papers (2020-05-03T18:23:06Z)
- Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, motivating their automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings.
Textual encoding approaches, e.g., KG-BERT, instead use the text of graph triples and triple-level contextualized representations.
arXiv Detail & Related papers (2020-04-30T13:50:34Z)
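The last entry above contrasts graph embedding approaches such as TransE with textual encoders. TransE models a relation as a translation in embedding space: a triple (h, r, t) is plausible when h + r is close to t. A minimal sketch of the scoring function, using toy hand-picked vectors rather than trained embeddings (real TransE learns them with a margin-based ranking loss, omitted here):

```python
def transe_score(h, r, t):
    """Score a (head, relation, tail) triple as -||h + r - t||_2.

    Higher (less negative) scores mean more plausible triples.
    h, r, t are embedding vectors, represented as plain lists here.
    """
    dist = sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)) ** 0.5
    return -dist

# Toy embeddings: the first tail satisfies h + r == t exactly,
# so it scores higher than an unrelated tail.
h, r = [1.0, 0.0], [0.0, 1.0]
t_good, t_bad = [1.0, 1.0], [3.0, -2.0]
assert transe_score(h, r, t_good) > transe_score(h, r, t_bad)
```

The translation view is what makes TransE cheap to score at completion time: ranking candidate tails for a query (h, r, ?) reduces to a nearest-neighbor search around h + r.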
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.