Sequential Topic Selection Model with Latent Variable for Topic-Grounded Dialogue
- URL: http://arxiv.org/abs/2210.08801v1
- Date: Mon, 17 Oct 2022 07:34:14 GMT
- Title: Sequential Topic Selection Model with Latent Variable for Topic-Grounded Dialogue
- Authors: Xiaofei Wen, Wei Wei and Xian-Ling Mao
- Abstract summary: We propose a novel approach, named Sequential Global Topic Attention (SGTA) to exploit topic transition over all conversations.
Our model outperforms competitive baselines on prediction and generation tasks.
- Score: 21.1427816176227
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, topic-grounded dialogue systems have attracted significant
attention due to their effectiveness in predicting the next topic, from the
historical context and a given topic sequence, to yield better responses.
However, almost all existing topic prediction solutions focus only on the
current conversation and its corresponding topic sequence, without exploiting
other topic-guided conversations that may contain topic transitions relevant to
the current one. To address this problem, we propose a novel approach, named
Sequential Global Topic Attention (SGTA), that exploits topic transitions over
all conversations to better model post-to-response topic transitions and to
guide response generation for the current conversation. Specifically, we
introduce a latent space, modeled as a multivariate skew-normal distribution
with hybrid kernel functions, to flexibly integrate global-level with
sequence-level information, and predict the next topic from samples drawn from
this distribution. We also leverage a topic-aware prior-posterior approach for
secondary selection of the predicted topics, which is used to optimize the
response generation task. Extensive experiments demonstrate that our model
outperforms competitive baselines on both prediction and generation tasks.
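The sampling-based prediction step described in the abstract can be illustrated with a small sketch: draw latent vectors from a skew-normal distribution and score candidate topics by their averaged softmax similarity to the samples. This is a simplification under stated assumptions — it uses independent componentwise skew-normal draws (via the standard stochastic representation) rather than the paper's full multivariate skew-normal with hybrid kernels, and all names (`sample_skew_normal`, `predict_topic`, the toy topic embeddings) are illustrative, not from the paper.

```python
import numpy as np

def sample_skew_normal(dim, alpha, rng):
    # Componentwise skew-normal sample via the stochastic representation
    # z = delta*|t0| + sqrt(1 - delta^2)*t1, with skewness parameter alpha.
    # (An independence simplification of the paper's multivariate latent space.)
    delta = alpha / np.sqrt(1.0 + alpha**2)
    t0 = np.abs(rng.standard_normal(dim))
    t1 = rng.standard_normal(dim)
    return delta * t0 + np.sqrt(1.0 - delta**2) * t1

def predict_topic(topic_embeddings, alpha=2.0, n_samples=50, seed=0):
    # Score each candidate topic by averaging softmax similarities between
    # the topic embeddings and latent vectors sampled from the distribution.
    rng = np.random.default_rng(seed)
    n_topics, dim = topic_embeddings.shape
    scores = np.zeros(n_topics)
    for _ in range(n_samples):
        z = sample_skew_normal(dim, alpha, rng)
        logits = topic_embeddings @ z
        exp = np.exp(logits - logits.max())   # numerically stable softmax
        scores += exp / exp.sum()
    return scores / n_samples                 # a distribution over topics

topics = np.random.default_rng(1).standard_normal((5, 16))  # 5 toy candidates
probs = predict_topic(topics)
print(round(probs.sum(), 6))  # prints 1.0 (averaged softmaxes stay normalized)
```

The averaging over samples is what makes the prediction reflect the latent distribution rather than a single point estimate.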
Related papers
- Bundle Fragments into a Whole: Mining More Complete Clusters via Submodular Selection of Interesting webpages for Web Topic Detection [49.8035161337388]
A state-of-the-art solution is firstly to organize webpages into a large volume of multi-granularity topic candidates.
Hot topics are further identified by estimating their interestingness.
This paper proposes a bundling-refining approach to mine more complete hot topics from fragments.
arXiv Detail & Related papers (2024-09-19T00:46:31Z)
- Personalized Topic Selection Model for Topic-Grounded Dialogue [24.74527189182273]
Current models tend to predict topics that are uninteresting to the user and contextually irrelevant.
We propose a Personalized topic sElection model for Topic-grounded Dialogue, named PETD.
Our proposed method can generate engaging and diverse responses, outperforming state-of-the-art baselines.
arXiv Detail & Related papers (2024-06-04T06:09:49Z)
- Multi-Granularity Prompts for Topic Shift Detection in Dialogue [13.739991183173494]
The goal of dialogue topic shift detection is to identify whether the current topic in a conversation has changed or needs to change.
Previous work focused on detecting topic shifts using pre-trained models to encode the utterance.
We take a prompt-based approach to fully extract topic information from dialogues at multiple granularities, i.e., label, turn, and topic.
arXiv Detail & Related papers (2023-05-23T12:35:49Z)
- TopicRefine: Joint Topic Prediction and Dialogue Response Generation for Multi-turn End-to-End Dialogue System [12.135300607779753]
A multi-turn dialogue always follows a specific topic thread, and topic shift at the discourse level occurs naturally.
Previous research has either predicted the topic first and then generated the relevant response, or simply applied the attention mechanism to all topics.
We propose a joint framework with a topic refinement mechanism to learn these two tasks simultaneously.
arXiv Detail & Related papers (2021-09-11T04:43:07Z)
- Unsupervised Summarization for Chat Logs with Topic-Oriented Ranking and Context-Aware Auto-Encoders [59.038157066874255]
We propose a novel framework called RankAE to perform chat summarization without employing manually labeled data.
RankAE consists of a topic-oriented ranking strategy that selects topic utterances according to centrality and diversity simultaneously.
A denoising auto-encoder is designed to generate succinct but context-informative summaries based on the selected utterances.
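A ranking that selects utterances "according to centrality and diversity simultaneously" can be sketched as greedy MMR-style selection: prefer utterances similar to the whole chat (centrality) while penalizing similarity to utterances already picked (diversity). This is an illustrative stand-in, not RankAE's exact scoring; `rank_utterances` and the toy similarity matrix are assumptions.

```python
import numpy as np

def rank_utterances(sim, k, lam=0.7):
    # Greedily pick k utterance indices from a pairwise-similarity matrix,
    # trading off centrality (mean similarity to all utterances) against
    # redundancy with the already-selected set. MMR-style sketch only.
    centrality = sim.mean(axis=1)
    selected, candidates = [], list(range(sim.shape[0]))
    while candidates and len(selected) < k:
        def score(i):
            redundancy = max((sim[i, j] for j in selected), default=0.0)
            return lam * centrality[i] - (1.0 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy symmetric similarity matrix for 4 utterances; 0 and 1 are near-duplicates.
sim = np.array([[1.0, 0.9, 0.2, 0.1],
                [0.9, 1.0, 0.3, 0.2],
                [0.2, 0.3, 1.0, 0.4],
                [0.1, 0.2, 0.4, 1.0]])
picked = rank_utterances(sim, k=2)
print(picked)  # [1, 2]: the most central utterance, then a diverse one
```

Note how the redundancy term keeps the near-duplicate utterance 0 out of the selection even though its centrality is high.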
arXiv Detail & Related papers (2020-12-14T07:31:17Z)
- Response Selection for Multi-Party Conversations with Dynamic Topic Tracking [63.15158355071206]
We frame response selection as a dynamic topic tracking task to match the topic between the response and relevant conversation context.
We propose a novel multi-task learning framework that supports efficient encoding through large pretrained models.
Experimental results on the DSTC-8 Ubuntu IRC dataset show state-of-the-art results in response selection and topic disentanglement tasks.
arXiv Detail & Related papers (2020-10-15T14:21:38Z)
- Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization [72.54873655114844]
Text summarization is one of the most challenging and interesting problems in NLP.
This work proposes a multi-view sequence-to-sequence model by first extracting conversational structures of unstructured daily chats from different views to represent conversations.
Experiments on a large-scale dialogue summarization corpus demonstrated that our methods significantly outperformed previous state-of-the-art models via both automatic evaluations and human judgment.
arXiv Detail & Related papers (2020-10-04T20:12:44Z)
- Modeling Topical Relevance for Multi-Turn Dialogue Generation [61.87165077442267]
We propose a new model, named STAR-BTM, to tackle the problem of topic drift in multi-turn dialogue.
The Biterm Topic Model is pre-trained on the whole training dataset. Topic-level attention weights are then computed from the topic representation of each context.
Experimental results on both Chinese customer services data and English Ubuntu dialogue data show that STAR-BTM significantly outperforms several state-of-the-art methods.
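Computing attention weights over context turns from their topic representations can be sketched as standard scaled dot-product attention; this is a generic illustration, not STAR-BTM's exact formulation, and `topic_attention_weights` with its toy inputs is an assumption.

```python
import numpy as np

def topic_attention_weights(context_topic_reprs, query):
    # Scaled dot-product attention: one weight per context turn, computed
    # from that turn's topic representation against a query vector.
    d = query.shape[-1]
    logits = context_topic_reprs @ query / np.sqrt(d)
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
contexts = rng.standard_normal((3, 8))   # topic representation per context turn
query = rng.standard_normal(8)
weights = topic_attention_weights(contexts, query)
```

The resulting weights form a distribution over context turns, so the decoder can focus on topically relevant history when generating the response.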
arXiv Detail & Related papers (2020-09-27T03:33:22Z)
- Topic-Aware Multi-turn Dialogue Modeling [91.52820664879432]
This paper presents a novel solution for multi-turn dialogue modeling, which segments and extracts topic-aware utterances in an unsupervised way.
Our topic-aware modeling is implemented by a newly proposed unsupervised topic-aware segmentation algorithm and Topic-Aware Dual-attention Matching (TADAM) Network.
arXiv Detail & Related papers (2020-09-26T08:43:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.