Enhancing Dialogue Summarization with Topic-Aware Global- and Local-Level Centrality
- URL: http://arxiv.org/abs/2301.12376v1
- Date: Sun, 29 Jan 2023 06:41:55 GMT
- Title: Enhancing Dialogue Summarization with Topic-Aware Global- and Local-Level Centrality
- Authors: Xinnian Liang, Shuangzhi Wu, Chenhao Cui, Jiaqi Bai, Chao Bian, Zhoujun Li
- Abstract summary: We propose a novel topic-aware Global-Local Centrality (GLC) model to help select the salient context from all sub-topics.
The global one aims to identify vital sub-topics in the dialogue and the local one aims to select the most important context in each sub-topic.
Experimental results show that our model outperforms strong baselines on three public dialogue summarization datasets.
- Score: 24.838387172698543
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dialogue summarization aims to condense a given dialogue into a simple and
focused summary text. Typically, both the roles' viewpoints and conversational
topics change in the dialogue stream. Thus, a major challenge of this task is to handle shifting topics effectively and select the most salient utterances. In this paper, we propose a novel topic-aware
Global-Local Centrality (GLC) model to help select the salient context from all
sub-topics. The centralities are constructed at both the global and local
levels. The global one aims to identify vital sub-topics in the dialogue and
the local one aims to select the most important context in each sub-topic.
Specifically, the GLC collects sub-topics based on the utterance representations, aligning each utterance with one sub-topic. Based on the
sub-topics, the GLC calculates global- and local-level centralities. Finally,
we combine the two to guide the model to capture both salient context and
sub-topics when generating summaries. Experimental results show that our model
outperforms strong baselines on three public dialogue summarization datasets:
CSDS, MC, and SAMSum. Further analysis demonstrates that our GLC can precisely identify vital content within sub-topics. Code: https://github.com/xnliang98/bart-glc
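To make the pipeline above concrete, here is a minimal sketch of the global/local centrality idea, assuming precomputed utterance embeddings and per-utterance sub-topic assignments; the cosine similarity, mean-pooled sub-topic vectors, and the multiplicative combination are illustrative assumptions, not necessarily the paper's exact formulation (see the linked repository for the real implementation).

```python
import numpy as np

def cosine_sim_matrix(X):
    """Pairwise cosine similarity between the row vectors of X."""
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    return X @ X.T

def global_local_centrality(utt_embs, topic_ids):
    """Toy global/local centrality in the spirit of GLC (not the authors' code).

    utt_embs:  (n, d) array, one embedding per utterance.
    topic_ids: length-n array aligning each utterance with one sub-topic.
    Returns a per-utterance salience score.
    """
    utt_embs = np.asarray(utt_embs, dtype=float)
    topic_ids = np.asarray(topic_ids)
    topics = sorted(set(topic_ids))
    # Sub-topic representation: mean of its utterances' embeddings (assumption).
    topic_embs = np.stack([utt_embs[topic_ids == t].mean(axis=0) for t in topics])
    # Global centrality: how central each sub-topic is among all sub-topics.
    g = cosine_sim_matrix(topic_embs).sum(axis=1)
    g = g / g.sum()
    scores = np.zeros(len(utt_embs))
    for weight, t in zip(g, topics):
        idx = np.where(topic_ids == t)[0]
        # Local centrality: how central each utterance is within its sub-topic.
        local = cosine_sim_matrix(utt_embs[idx]).sum(axis=1)
        # Combine the two levels so that utterances in vital sub-topics win.
        scores[idx] = weight * local / local.sum()
    return scores
```

Under this reading, an utterance scores highest when it is central within a sub-topic that is itself central to the whole dialogue, which is exactly the selection behavior the abstract describes.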
Related papers
- Local and Global Contexts for Conversation [2.566915473185134]
We introduce a local and global conversation model (LGCM) for general-purpose, open-domain conversation.
It is a local-global hierarchical transformer model that excels at accurately discerning and assimilating the relevant contexts.
It employs a local encoder to grasp the local context at the level of individual utterances and a global encoder to understand the broader context at the dialogue level (a minimal sketch of this split follows the entry).
arXiv Detail & Related papers (2024-01-31T04:19:22Z)
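As a gloss on the local/global split described in this entry, the following is a generic hierarchical-transformer sketch, not the authors' LGCM; layer counts, pooling, and dimensions are arbitrary assumptions.

```python
import torch.nn as nn

class HierarchicalDialogueEncoder(nn.Module):
    """Local encoder per utterance + global encoder over the dialogue (illustrative)."""

    def __init__(self, vocab_size, d_model=256, nhead=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Local encoder: contextualizes tokens within a single utterance.
        self.local_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2)
        # Global encoder: contextualizes utterance vectors across the dialogue.
        self.global_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2)

    def forward(self, dialogue_tokens):
        # dialogue_tokens: (num_utterances, max_len) token ids for one dialogue.
        token_states = self.local_encoder(self.embed(dialogue_tokens))   # (U, L, d)
        utt_vecs = token_states.mean(dim=1)  # pool tokens into one vector per utterance
        return self.global_encoder(utt_vecs.unsqueeze(0))  # (1, U, d) dialogue context
```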
- Multi-turn Dialogue Comprehension from a Topic-aware Perspective [70.37126956655985]
This paper proposes to model multi-turn dialogues from a topic-aware perspective.
We use a dialogue segmentation algorithm to split a dialogue passage into topic-concentrated fragments in an unsupervised way (a toy version is sketched after this entry).
We also present a novel model, Topic-Aware Dual-Attention Matching (TADAM) Network, which takes topic segments as processing elements.
arXiv Detail & Related papers (2023-09-18T11:03:55Z)
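The entry above leaves the unsupervised segmentation algorithm unspecified; a common TextTiling-style baseline is to start a new fragment wherever consecutive utterance embeddings diverge. The sketch below is that baseline, not the paper's method, and `threshold` is a free parameter; its output plugs directly into the `topic_ids` argument of the GLC sketch earlier.

```python
import numpy as np

def segment_dialogue(utt_embs, threshold=0.5):
    """Split a dialogue into topic-concentrated fragments (toy baseline).

    Starts a new segment whenever the cosine similarity between
    consecutive utterance embeddings drops below `threshold`.
    Returns one sub-topic id per utterance.
    """
    utt_embs = np.asarray(utt_embs, dtype=float)
    norms = np.linalg.norm(utt_embs, axis=1)
    topic_ids = [0]
    for i in range(1, len(utt_embs)):
        sim = utt_embs[i - 1] @ utt_embs[i] / (norms[i - 1] * norms[i])
        topic_ids.append(topic_ids[-1] + 1 if sim < threshold else topic_ids[-1])
    return np.array(topic_ids)
```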
- Multi-Granularity Prompts for Topic Shift Detection in Dialogue [13.739991183173494]
The goal of dialogue topic shift detection is to identify whether the current topic in a conversation has changed or needs to change.
Previous work focused on detecting topic shifts using pre-trained models to encode the utterance.
We take a prompt-based approach to fully extract topic information from dialogues at multiple granularities, i.e., label, turn, and topic.
arXiv Detail & Related papers (2023-05-23T12:35:49Z)
- Sequential Topic Selection Model with Latent Variable for Topic-Grounded Dialogue [21.1427816176227]
We propose a novel approach, named Sequential Global Topic Attention (SGTA), to exploit topic transitions over all conversations.
Our model outperforms competitive baselines on prediction and generation tasks.
arXiv Detail & Related papers (2022-10-17T07:34:14Z)
- Coalescing Global and Local Information for Procedural Text Understanding [70.10291759879887]
A complete procedural understanding solution should combine three core aspects: local and global views of the inputs, and a global view of the outputs.
In this paper, we propose Coalescing Global and Local Information (CGLI), a new model that builds entity and time representations.
Experiments on a popular procedural text understanding dataset show that our model achieves state-of-the-art results.
arXiv Detail & Related papers (2022-08-26T19:16:32Z)
- Topic-Oriented Spoken Dialogue Summarization for Customer Service with Saliency-Aware Topic Modeling [61.67321200994117]
In a customer service system, dialogue summarization can boost service efficiency by creating summaries for long spoken dialogues.
In this work, we focus on topic-oriented dialogue summarization, which generates highly abstractive summaries.
We propose a novel topic-augmented two-stage dialogue summarizer (TDS) jointly with a saliency-aware neural topic model (SATM) for topic-oriented summarization of customer service dialogues.
arXiv Detail & Related papers (2020-12-14T07:50:25Z)
- Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization [72.54873655114844]
Text summarization is one of the most challenging and interesting problems in NLP.
This work proposes a multi-view sequence-to-sequence model that first extracts conversational structures of unstructured daily chats from different views to represent conversations.
Experiments on a large-scale dialogue summarization corpus demonstrated that our methods significantly outperformed previous state-of-the-art models via both automatic evaluations and human judgment.
arXiv Detail & Related papers (2020-10-04T20:12:44Z)
- Modeling Topical Relevance for Multi-Turn Dialogue Generation [61.87165077442267]
We propose a new model, named STAR-BTM, to tackle the problem of topic drift in multi-turn dialogue.
The Biterm Topic Model is pre-trained on the whole training dataset. Then, the topic-level attention weights are computed based on the topic representation of each context (sketched below).
Experimental results on both Chinese customer services data and English Ubuntu dialogue data show that STAR-BTM significantly outperforms several state-of-the-art methods.
arXiv Detail & Related papers (2020-09-27T03:33:22Z)
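The topic-level attention mentioned in this entry can be read as a softmax over the match between a context's topic representation and each topic's embedding. A schematic version follows; the pre-trained Biterm Topic Model is assumed to supply `context_topic_vec`, and all names and the temperature parameter are illustrative, not taken from the paper.

```python
import numpy as np

def topic_attention(context_topic_vec, topic_embs, temperature=1.0):
    """Schematic topic-level attention weights (not the STAR-BTM code).

    context_topic_vec: (d,) topic representation of the current context,
                       e.g. inferred by a pre-trained Biterm Topic Model.
    topic_embs:        (K, d) embedding of each of the K topics.
    Returns K attention weights that sum to 1.
    """
    scores = topic_embs @ context_topic_vec / temperature
    scores = scores - scores.max()  # subtract max for numerical stability
    weights = np.exp(scores)
    return weights / weights.sum()
```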
- Topic-Aware Multi-turn Dialogue Modeling [91.52820664879432]
This paper presents a novel solution for multi-turn dialogue modeling, which segments and extracts topic-aware utterances in an unsupervised way.
Our topic-aware modeling is implemented by a newly proposed unsupervised topic-aware segmentation algorithm and Topic-Aware Dual-attention Matching (TADAM) Network.
arXiv Detail & Related papers (2020-09-26T08:43:06Z)