Multi-View Sequence-to-Sequence Models with Conversational Structure for
Abstractive Dialogue Summarization
- URL: http://arxiv.org/abs/2010.01672v1
- Date: Sun, 4 Oct 2020 20:12:44 GMT
- Title: Multi-View Sequence-to-Sequence Models with Conversational Structure for
Abstractive Dialogue Summarization
- Authors: Jiaao Chen, Diyi Yang
- Abstract summary: Text summarization is one of the most challenging and interesting problems in NLP.
This work proposes a multi-view sequence-to-sequence model by first extracting conversational structures of unstructured daily chats from different views to represent conversations.
Experiments on a large-scale dialogue summarization corpus demonstrated that our methods significantly outperformed previous state-of-the-art models via both automatic evaluations and human judgment.
- Score: 72.54873655114844
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Text summarization is one of the most challenging and interesting problems in
NLP. Although much attention has been paid to summarizing structured text like
news reports or encyclopedia articles, summarizing conversations---an essential
part of human-human/machine interaction where most important pieces of
information are scattered across various utterances of different
speakers---remains relatively under-investigated. This work proposes a
multi-view sequence-to-sequence model by first extracting conversational
structures of unstructured daily chats from different views to represent
conversations and then utilizing a multi-view decoder to incorporate different
views to generate dialogue summaries. Experiments on a large-scale dialogue
summarization corpus demonstrated that our methods significantly outperformed
previous state-of-the-art models via both automatic evaluations and human
judgment. We also discussed specific challenges that current approaches faced
with this task. We have publicly released our code at
https://github.com/GT-SALT/Multi-View-Seq2Seq.
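The core idea of the multi-view decoder — attending over several encoded views of the same conversation and fusing them into one context vector — can be illustrated with a toy sketch. This is not the paper's actual architecture (see the released code for that); the dot-product scoring, the fixed-size view vectors, and all names here are simplifying assumptions for illustration only:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def fuse_views(decoder_state, view_vectors):
    """Weight each view's context vector by its relevance to the
    current decoder state (dot-product scoring), then return the
    attention weights and the weighted sum of the views."""
    scores = [sum(d * v for d, v in zip(decoder_state, vec))
              for vec in view_vectors]
    weights = softmax(scores)
    dim = len(decoder_state)
    fused = [sum(w * vec[i] for w, vec in zip(weights, view_vectors))
             for i in range(dim)]
    return weights, fused

# Two hypothetical "views" of the same conversation (e.g. a topic view
# and a conversation-stage view), each already encoded to a vector.
topic_view = [0.9, 0.1, 0.0]
stage_view = [0.2, 0.8, 0.3]
weights, fused = fuse_views([1.0, 0.0, 0.0], [topic_view, stage_view])
```

At each decoding step the weights shift, letting the generator lean on whichever view is most informative for the next summary token; here the decoder state aligns more with the topic view, so that view dominates the fused vector.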
Related papers
- Multi-turn Dialogue Comprehension from a Topic-aware Perspective [70.37126956655985]
This paper proposes to model multi-turn dialogues from a topic-aware perspective.
We use a dialogue segmentation algorithm to split a dialogue passage into topic-concentrated fragments in an unsupervised way.
We also present a novel model, Topic-Aware Dual-Attention Matching (TADAM) Network, which takes topic segments as processing elements.
arXiv Detail & Related papers (2023-09-18T11:03:55Z)
- DIONYSUS: A Pre-trained Model for Low-Resource Dialogue Summarization [127.714919036388]
DIONYSUS is a pre-trained encoder-decoder model for summarizing dialogues in any new domain.
Our experiments show that DIONYSUS outperforms existing methods on six datasets.
arXiv Detail & Related papers (2022-12-20T06:21:21Z)
- Who says like a style of Vitamin: Towards Syntax-Aware Dialogue Summarization using Multi-task Learning [2.251583286448503]
We focus on the association between utterances from individual speakers and unique syntactic structures.
Speakers have unique textual styles that can contain linguistic information, such as voiceprint.
We employ multi-task learning of both syntax-aware information and dialogue summarization.
arXiv Detail & Related papers (2021-09-29T05:30:39Z)
- Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization [41.75442239197745]
This work proposes two topic-aware contrastive learning objectives, namely coherence detection and sub-summary generation objectives.
Experiments on benchmark datasets demonstrate that the proposed simple method significantly outperforms strong baselines.
arXiv Detail & Related papers (2021-09-10T17:03:25Z)
- Structure-Aware Abstractive Conversation Summarization via Discourse and Action Graphs [22.58861442978803]
We propose to explicitly model the rich structures in conversations for more precise and accurate conversation summarization.
We incorporate discourse relations between utterances and action triples in utterances through structured graphs to better encode conversations.
Experiments show that our proposed models outperform state-of-the-art methods and generalize well in other domains.
arXiv Detail & Related papers (2021-04-16T23:04:52Z)
- Unsupervised Summarization for Chat Logs with Topic-Oriented Ranking and Context-Aware Auto-Encoders [59.038157066874255]
We propose a novel framework called RankAE to perform chat summarization without employing manually labeled data.
RankAE consists of a topic-oriented ranking strategy that selects topic utterances according to centrality and diversity simultaneously.
A denoising auto-encoder is designed to generate succinct but context-informative summaries based on the selected utterances.
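The topic-oriented ranking described above — selecting utterances by centrality and diversity simultaneously — can be sketched with a greedy, MMR-style selection. This is an illustrative simplification, not RankAE's actual strategy; the pairwise similarity matrix, the trade-off weight `lam`, and the max-redundancy penalty are all assumptions made for this sketch:

```python
def select_utterances(sim, k, lam=0.7):
    """Greedily pick k utterances, balancing centrality (average
    similarity to all utterances) against diversity (penalizing
    similarity to utterances already selected).

    sim: square matrix of pairwise utterance similarities in [0, 1].
    """
    n = len(sim)
    centrality = [sum(row) / n for row in sim]
    selected = []
    while len(selected) < min(k, n):
        best, best_score = None, float("-inf")
        for i in range(n):
            if i in selected:
                continue
            # Redundancy: similarity to the closest already-picked utterance.
            redundancy = max((sim[i][j] for j in selected), default=0.0)
            score = lam * centrality[i] - (1 - lam) * redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
    return selected

# Toy similarity matrix: utterances 0 and 1 are near-duplicates,
# utterance 2 covers a different topic.
sim = [
    [1.0, 0.9, 0.1],
    [0.9, 1.0, 0.1],
    [0.1, 0.1, 1.0],
]
picked = select_utterances(sim, k=2)
```

With two near-duplicate utterances and one off-topic utterance, the diversity penalty steers the second pick away from the duplicate, so the selection covers both topics.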
arXiv Detail & Related papers (2020-12-14T07:31:17Z)
- Topic-Aware Multi-turn Dialogue Modeling [91.52820664879432]
This paper presents a novel solution for multi-turn dialogue modeling, which segments and extracts topic-aware utterances in an unsupervised way.
Our topic-aware modeling is implemented by a newly proposed unsupervised topic-aware segmentation algorithm and Topic-Aware Dual-attention Matching (TADAM) Network.
arXiv Detail & Related papers (2020-09-26T08:43:06Z)
- Diversifying Dialogue Generation with Non-Conversational Text [38.03510529185192]
We propose a new perspective to diversify dialogue generation by leveraging non-conversational text.
We collect a large-scale non-conversational corpus from multiple sources, including forum comments, idioms, and book snippets.
The resulting model is tested on two conversational datasets and is shown to produce significantly more diverse responses without sacrificing the relevance with context.
arXiv Detail & Related papers (2020-05-09T02:16:05Z)
- The Shmoop Corpus: A Dataset of Stories with Loosely Aligned Summaries [72.48439126769627]
We introduce the Shmoop Corpus: a dataset of 231 stories paired with detailed multi-paragraph summaries for each individual chapter.
From the corpus, we construct a set of common NLP tasks, including Cloze-form question answering and a simplified form of abstractive summarization.
We believe that the unique structure of this corpus provides an important foothold towards making machine story comprehension more approachable.
arXiv Detail & Related papers (2019-12-30T21:03:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.