OTTers: One-turn Topic Transitions for Open-Domain Dialogue
- URL: http://arxiv.org/abs/2105.13710v1
- Date: Fri, 28 May 2021 10:16:59 GMT
- Title: OTTers: One-turn Topic Transitions for Open-Domain Dialogue
- Authors: Karin Sevegnani, David M. Howcroft, Ioannis Konstas, Verena Rieser
- Abstract summary: Mixed initiative in open-domain dialogue requires a system to proactively introduce new topics.
The one-turn topic transition task explores how a system connects two topics in a cooperative and coherent manner.
- Score: 11.305029351461306
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Mixed initiative in open-domain dialogue requires a system to proactively
introduce new topics. The one-turn topic transition task explores how a system
connects two topics in a cooperative and coherent manner. The goal of the task
is to generate a "bridging" utterance connecting the new topic to the topic of
the previous conversation turn. We are especially interested in commonsense
explanations of how a new topic relates to what has been mentioned before. We
first collect a new dataset of human one-turn topic transitions, which we call
OTTers. We then analyse the strategies humans use when asked to complete such
a task, and find that a bridging utterance connecting the two topics is the
most common approach. We finally show how
existing state-of-the-art text generation models can be adapted to this task
and examine the performance of these baselines on different splits of the
OTTers data.
Related papers
- Multi-turn Dialogue Comprehension from a Topic-aware Perspective [70.37126956655985]
This paper proposes to model multi-turn dialogues from a topic-aware perspective.
We use a dialogue segmentation algorithm to split a dialogue passage into topic-concentrated fragments in an unsupervised way.
We also present a novel model, Topic-Aware Dual-Attention Matching (TADAM) Network, which takes topic segments as processing elements.
arXiv Detail & Related papers (2023-09-18T11:03:55Z)
- Multi-Granularity Prompts for Topic Shift Detection in Dialogue [13.739991183173494]
The goal of dialogue topic shift detection is to identify whether the current topic in a conversation has changed or needs to change.
Previous work focused on detecting topic shifts using pre-trained models to encode the utterance.
We take a prompt-based approach to fully extract topic information from dialogues at multiple granularities, i.e., the label, turn, and topic levels.
arXiv Detail & Related papers (2023-05-23T12:35:49Z)
- He Said, She Said: Style Transfer for Shifting the Perspective of Dialogues [75.58367095888914]
We define a new style transfer task, perspective shift, which reframes an informal first-person dialogue as a formal third-person rephrasing of the text.
As a sample application, we demonstrate that applying perspective shifting to a dialogue summarization dataset (SAMSum) substantially improves the zero-shot performance of extractive news summarization models.
arXiv Detail & Related papers (2022-10-27T14:16:07Z)
- Sequential Topic Selection Model with Latent Variable for Topic-Grounded Dialogue [21.1427816176227]
We propose a novel approach, named Sequential Global Topic Attention (SGTA) to exploit topic transition over all conversations.
Our model outperforms competitive baselines on prediction and generation tasks.
arXiv Detail & Related papers (2022-10-17T07:34:14Z)
- Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization [41.75442239197745]
This work proposes two topic-aware contrastive learning objectives, namely coherence detection and sub-summary generation objectives.
Experiments on benchmark datasets demonstrate that the proposed simple method significantly outperforms strong baselines.
arXiv Detail & Related papers (2021-09-10T17:03:25Z)
- TIAGE: A Benchmark for Topic-Shift Aware Dialog Modeling [32.28415754809567]
TIAGE is a new topic-shift aware dialog benchmark constructed using human annotations of topic shifts.
Based on TIAGE, we introduce three tasks to investigate different scenarios of topic-shift modeling in dialog settings.
arXiv Detail & Related papers (2021-09-09T21:06:12Z)
- Topic-Oriented Spoken Dialogue Summarization for Customer Service with Saliency-Aware Topic Modeling [61.67321200994117]
In a customer service system, dialogue summarization can boost service efficiency by creating summaries for long spoken dialogues.
In this work, we focus on topic-oriented dialogue summarization, which generates highly abstractive summaries.
We propose a novel topic-augmented two-stage dialogue summarizer (TDS) jointly with a saliency-aware neural topic model (SATM) for topic-oriented summarization of customer service dialogues.
arXiv Detail & Related papers (2020-12-14T07:50:25Z)
- Response Selection for Multi-Party Conversations with Dynamic Topic Tracking [63.15158355071206]
We frame response selection as a dynamic topic tracking task to match the topic between the response and relevant conversation context.
We propose a novel multi-task learning framework that supports efficient encoding through large pretrained models.
Experimental results on the DSTC-8 Ubuntu IRC dataset show state-of-the-art results in response selection and topic disentanglement tasks.
arXiv Detail & Related papers (2020-10-15T14:21:38Z)
- Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization [72.54873655114844]
Text summarization is one of the most challenging and interesting problems in NLP.
This work proposes a multi-view sequence-to-sequence model by first extracting conversational structures of unstructured daily chats from different views to represent conversations.
Experiments on a large-scale dialogue summarization corpus demonstrated that our methods significantly outperformed previous state-of-the-art models via both automatic evaluations and human judgment.
arXiv Detail & Related papers (2020-10-04T20:12:44Z)
- Topic-Aware Multi-turn Dialogue Modeling [91.52820664879432]
This paper presents a novel solution for multi-turn dialogue modeling, which segments and extracts topic-aware utterances in an unsupervised way.
Our topic-aware modeling is implemented by a newly proposed unsupervised topic-aware segmentation algorithm and Topic-Aware Dual-attention Matching (TADAM) Network.
arXiv Detail & Related papers (2020-09-26T08:43:06Z)
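Several of the papers above rely on unsupervised topic segmentation of a dialogue. The following is a minimal illustrative sketch of that idea only, not the TADAM algorithm or any other cited method: it opens a new topic segment whenever bag-of-words similarity between consecutive utterances drops below a threshold (the threshold value and toy dialogue are assumptions).

```python
# Minimal illustrative sketch of unsupervised dialogue topic segmentation:
# start a new segment when lexical similarity between consecutive
# utterances drops below a threshold. A similarity-drop heuristic only,
# not any cited paper's actual algorithm.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors.
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def segment(utterances: list[str], threshold: float = 0.2) -> list[list[str]]:
    """Group consecutive utterances into topic-concentrated segments."""
    segments: list[list[str]] = []
    prev = None
    for utt in utterances:
        bow = Counter(utt.lower().split())
        if prev is None or cosine(prev, bow) < threshold:
            segments.append([utt])   # similarity drop: open a new segment
        else:
            segments[-1].append(utt)
        prev = bow
    return segments

dialogue = [
    "Did you watch the football game last night?",
    "Yes, the football game was great!",
    "By the way, my new laptop arrived today.",
]
print(len(segment(dialogue)))  # prints 2: the laptop utterance opens a new segment
```

Real systems replace the bag-of-words vectors with pretrained-model embeddings, but the segmentation principle, cutting where inter-utterance coherence drops, is the same.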
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.