DSBERT: Unsupervised Dialogue Structure Learning with BERT
- URL: http://arxiv.org/abs/2111.04933v1
- Date: Tue, 9 Nov 2021 03:31:18 GMT
- Title: DSBERT: Unsupervised Dialogue Structure Learning with BERT
- Authors: Bingkun Chen, Shaobing Dai, Shenghua Zheng, Lei Liao, Yang Li
- Abstract summary: We propose a BERT-based unsupervised dialogue structure learning algorithm, DSBERT (Dialogue Structure BERT).
Unlike the previous SOTA models VRNN and SVRNN, DSBERT combines BERT with an AutoEncoder, which allows it to effectively incorporate context information.
Experimental results show that DSBERT generates dialogue structures closer to the real structure and can distinguish sentences with different semantics, mapping them to different hidden states.
- Score: 4.171523157658394
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Unsupervised dialogue structure learning is an important and meaningful task
in natural language processing. The extracted dialogue structure and process
can help analyze human dialogue, and play a vital role in the design and
evaluation of dialogue systems. Traditional dialogue systems require experts
to manually design the dialogue structure, which is very costly. Through
unsupervised dialogue structure learning, the dialogue structure can be
obtained automatically, reducing the cost for developers of constructing the
dialogue process. The learned dialogue structure can be used to promote
dialogue generation in downstream task systems and to improve the logic and
consistency of a dialogue robot's replies. In this paper, we propose a
BERT-based unsupervised dialogue structure learning algorithm, DSBERT
(Dialogue Structure BERT). Unlike the previous SOTA models VRNN and SVRNN, we
combine BERT and an AutoEncoder, which can effectively incorporate context
information. To better prevent the model from falling into a local optimum
and to make the dialogue state distribution more uniform and reasonable, we
also propose three balanced loss functions for dialogue structure learning.
Experimental results show that DSBERT can generate a dialogue structure
closer to the real structure and can distinguish sentences with different
semantics, mapping them to different hidden states.
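The abstract only names the ingredients (a BERT encoder, an AutoEncoder bottleneck over latent dialogue states, and balancing losses that keep the state distribution uniform), so the following is a minimal sketch of that idea, not the authors' implementation. The pooling strategy, layer sizes, the single KL-based balance term, and all hyperparameters are assumptions; the paper's three balanced losses are not reproduced here.

```python
# Hedged sketch of the DSBERT idea described in the abstract: encode each
# utterance with BERT, pass the sentence embedding through an autoencoder
# whose bottleneck is a soft assignment over K latent dialogue states, and
# add a balance regularizer that keeps average state usage close to uniform.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer

class DialogueStateAutoEncoder(nn.Module):
    def __init__(self, num_states: int = 10, hidden: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.to_states = nn.Linear(hidden, num_states)    # encoder head
        self.from_states = nn.Linear(num_states, hidden)  # decoder head

    def forward(self, input_ids, attention_mask):
        # Utterance embedding: mean-pooled BERT output (an assumption).
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        emb = out.last_hidden_state.mean(dim=1)
        state_probs = F.softmax(self.to_states(emb), dim=-1)  # soft state assignment
        recon = self.from_states(state_probs)                 # reconstruct the embedding
        return emb, state_probs, recon

def dsbert_style_loss(emb, state_probs, recon):
    # Reconstruction term: the bottleneck must preserve utterance semantics.
    recon_loss = F.mse_loss(recon, emb.detach())
    # Balance term: penalize deviation of the batch-average state distribution
    # from uniform, discouraging collapse of all utterances into one state.
    avg_usage = state_probs.mean(dim=0)
    uniform = torch.full_like(avg_usage, 1.0 / avg_usage.numel())
    balance_loss = F.kl_div(avg_usage.log(), uniform, reduction="batchmean")
    return recon_loss + balance_loss

# Toy usage on two utterances.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = DialogueStateAutoEncoder(num_states=5)
batch = tokenizer(["hello, how can I help you?", "I want to book a flight."],
                  padding=True, return_tensors="pt")
emb, probs, recon = model(batch["input_ids"], batch["attention_mask"])
loss = dsbert_style_loss(emb, probs, recon)
print(probs.argmax(dim=-1), loss.item())
```

Under these assumptions, a dialogue structure would be read off after training by assigning each utterance to its arg-max state and counting transitions between consecutive states.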
Related papers
- Revisiting Conversation Discourse for Dialogue Disentanglement [88.3386821205896]
We propose enhancing dialogue disentanglement by taking full advantage of the dialogue discourse characteristics.
We develop a structure-aware framework to integrate the rich structural features for better modeling the conversational semantic context.
Our work has great potential to facilitate broader multi-party multi-thread dialogue applications.
arXiv Detail & Related papers (2023-06-06T19:17:47Z) - CTRLStruct: Dialogue Structure Learning for Open-Domain Response
Generation [38.60073402817218]
Well-structured topic flow can leverage background information and predict future topics to help generate controllable and explainable responses.
We present a new framework for dialogue structure learning to effectively explore topic-level dialogue clusters as well as their transitions with unlabelled information.
Experiments on two popular open-domain dialogue datasets show our model can generate more coherent responses compared to some excellent dialogue models.
arXiv Detail & Related papers (2023-03-02T09:27:11Z) - STRUDEL: Structured Dialogue Summarization for Dialogue Comprehension [42.57581945778631]
Abstractive dialogue summarization has long been viewed as an important standalone task in natural language processing.
We propose a novel type of dialogue summarization task - STRUctured DiaLoguE Summarization.
We show that our STRUDEL dialogue comprehension model can significantly improve the dialogue comprehension performance of transformer encoder language models.
arXiv Detail & Related papers (2022-12-24T04:39:54Z) - Structure Extraction in Task-Oriented Dialogues with Slot Clustering [94.27806592467537]
In task-oriented dialogues, dialogue structure has often been represented as a transition graph among dialogue states.
We propose a simple yet effective approach for structure extraction in task-oriented dialogues.
arXiv Detail & Related papers (2022-02-28T20:18:12Z) - UniDS: A Unified Dialogue System for Chit-Chat and Task-oriented
Dialogues [59.499965460525694]
We propose a unified dialogue system (UniDS) with the two aforementioned skills.
We design a unified dialogue data schema, compatible for both chit-chat and task-oriented dialogues.
We train UniDS on mixed dialogue data, starting from a pretrained chit-chat dialogue model.
arXiv Detail & Related papers (2021-10-15T11:56:47Z) - Structural Modeling for Dialogue Disentanglement [43.352833140317486]
Tangled multi-party dialogue context leads to challenges for dialogue reading comprehension.
This work designs a novel model to disentangle multi-party history into threads, by taking dialogue structure features into account.
arXiv Detail & Related papers (2021-10-15T11:28:43Z) - Structured Attention for Unsupervised Dialogue Structure Induction [110.12561786644122]
We propose to incorporate structured attention layers into a Variational Recurrent Neural Network (VRNN) model with discrete latent states to learn dialogue structure in an unsupervised fashion.
Compared to a vanilla VRNN, structured attention enables a model to focus on different parts of the source sentence embeddings while enforcing a structural inductive bias.
arXiv Detail & Related papers (2020-09-17T23:07:03Z) - TOD-BERT: Pre-trained Natural Language Understanding for Task-Oriented
Dialogue [113.45485470103762]
In this work, we unify nine human-human and multi-turn task-oriented dialogue datasets for language modeling.
To better model dialogue behavior during pre-training, we incorporate user and system tokens into the masked language modeling.
arXiv Detail & Related papers (2020-04-15T04:09:05Z) - Conversation Learner -- A Machine Teaching Tool for Building Dialog
Managers for Task-Oriented Dialog Systems [57.082447660944965]
Conversation Learner is a machine teaching tool for building dialog managers.
It enables dialog authors to create a dialog flow using familiar tools, converting the dialog flow into a parametric model.
It allows dialog authors to improve the dialog manager over time by leveraging user-system dialog logs as training data.
arXiv Detail & Related papers (2020-04-09T00:10:54Z)