Unsupervised Learning of Hierarchical Conversation Structure
- URL: http://arxiv.org/abs/2205.12244v1
- Date: Tue, 24 May 2022 17:52:34 GMT
- Title: Unsupervised Learning of Hierarchical Conversation Structure
- Authors: Bo-Ru Lu, Yushi Hu, Hao Cheng, Noah A. Smith, Mari Ostendorf
- Abstract summary: Goal-oriented conversations often have meaningful sub-dialogue structure, but it can be highly domain-dependent.
This work introduces an unsupervised approach to learning hierarchical conversation structure, including turn and sub-dialogue segment labels.
The decoded structure is shown to be useful in enhancing neural models of language for three conversation-level understanding tasks.
- Score: 50.29889385593043
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human conversations can evolve in many different ways, creating challenges
for automatic understanding and summarization. Goal-oriented conversations
often have meaningful sub-dialogue structure, but it can be highly
domain-dependent. This work introduces an unsupervised approach to learning
hierarchical conversation structure, including turn and sub-dialogue segment
labels, corresponding roughly to dialogue acts and sub-tasks, respectively. The
decoded structure is shown to be useful in enhancing neural models of language
for three conversation-level understanding tasks. Further, the learned
finite-state sub-dialogue network is made interpretable through automatic
summarization. Our code and trained models are available at
https://github.com/boru-roylu/THETA.
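The abstract describes two levels of structure: turn-level labels (roughly dialogue acts) and sub-dialogue segments organized into a finite-state network. The sketch below is a minimal illustration of that general idea, not the THETA implementation: it clusters turn embeddings into discrete states with k-means, estimates a transition matrix over those states, and marks segment boundaries at state changes. The random embeddings, the number of states, and the change-point segmentation rule are all assumptions made for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Toy corpus: 20 conversations, each a sequence of turn embeddings (dim 32).
# In practice these would come from a pretrained sentence encoder.
conversations = [rng.normal(size=(int(rng.integers(5, 15)), 32)) for _ in range(20)]

n_states = 6  # assumed number of turn-level states (roughly analogous to dialogue acts)
all_turns = np.vstack(conversations)
kmeans = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit(all_turns)

# Decode each conversation into a sequence of discrete turn states.
state_seqs = [kmeans.predict(conv) for conv in conversations]

# Estimate a finite-state network: counts of transitions between consecutive states.
transitions = np.zeros((n_states, n_states))
for seq in state_seqs:
    for prev, curr in zip(seq[:-1], seq[1:]):
        transitions[prev, curr] += 1
transition_probs = transitions / transitions.sum(axis=1, keepdims=True).clip(min=1)

def segment_boundaries(seq):
    """Crude sub-dialogue segmentation: start a new segment at every state change."""
    return [0] + [i for i in range(1, len(seq)) if seq[i] != seq[i - 1]]

print(np.round(transition_probs, 2))
print(segment_boundaries(state_seqs[0]))
```

A real system would replace k-means with a latent-variable sequence model and learn segment-level states jointly; the decode-then-count pattern above only conveys how a transition graph over discrete turn states can be read off a corpus.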
Related papers
- Unsupervised Mutual Learning of Dialogue Discourse Parsing and Topic Segmentation [38.956438905614256]
In prior work, rhetorical structure and topic structure have mostly been modeled separately, or with one assisting the other.
We propose an unsupervised mutual learning framework of two structures leveraging the global and local connections between them.
We also incorporate rhetorical structures into the topic structure through a graph neural network model to ensure local coherence consistency.
arXiv Detail & Related papers (2024-05-30T08:10:50Z)
- Revisiting Conversation Discourse for Dialogue Disentanglement [88.3386821205896]
We propose enhancing dialogue disentanglement by taking full advantage of the dialogue discourse characteristics.
We develop a structure-aware framework to integrate the rich structural features for better modeling the conversational semantic context.
Our work has great potential to facilitate broader multi-party multi-thread dialogue applications.
arXiv Detail & Related papers (2023-06-06T19:17:47Z)
- Uncovering the Potential of ChatGPT for Discourse Analysis in Dialogue: An Empirical Study [51.079100495163736]
This paper systematically inspects ChatGPT's performance in two discourse analysis tasks: topic segmentation and discourse parsing.
ChatGPT demonstrates proficiency in identifying topic structures in general-domain conversations yet struggles considerably in domain-specific conversations.
Our deeper investigation indicates that ChatGPT can give topic structures that are more reasonable than human annotations, but it parses hierarchical rhetorical structures only linearly.
arXiv Detail & Related papers (2023-05-15T07:14:41Z)
- CTRLStruct: Dialogue Structure Learning for Open-Domain Response Generation [38.60073402817218]
Well-structured topic flow can leverage background information and predict future topics to help generate controllable and explainable responses.
We present a new framework for dialogue structure learning that effectively explores topic-level dialogue clusters as well as their transitions from unlabeled data.
Experiments on two popular open-domain dialogue datasets show that our model generates more coherent responses than strong existing dialogue models.
arXiv Detail & Related papers (2023-03-02T09:27:11Z)
- Structure Extraction in Task-Oriented Dialogues with Slot Clustering [94.27806592467537]
In task-oriented dialogues, dialogue structure has often been represented as a transition graph among dialogue states.
We propose a simple yet effective approach for structure extraction in task-oriented dialogues.
arXiv Detail & Related papers (2022-02-28T20:18:12Z)
- The Hierarchical Organization of Syntax [0.0]
We analyze the hierarchical organization of historical syntactic networks to understand how syntax evolves over time.
We created these networks from a corpus of German texts from the 11th to 17th centuries.
We named these syntactic structures "syntactic communicative hierarchies".
arXiv Detail & Related papers (2021-11-27T00:47:54Z)
- Multi-View Sequence-to-Sequence Models with Conversational Structure for Abstractive Dialogue Summarization [72.54873655114844]
Text summarization is one of the most challenging and interesting problems in NLP.
This work proposes a multi-view sequence-to-sequence model that first extracts conversational structures of unstructured daily chats from different views to represent conversations.
Experiments on a large-scale dialogue summarization corpus demonstrated that our method significantly outperformed previous state-of-the-art models in both automatic and human evaluations.
arXiv Detail & Related papers (2020-10-04T20:12:44Z)
- Structured Attention for Unsupervised Dialogue Structure Induction [110.12561786644122]
We propose to incorporate structured attention layers into a Variational Recurrent Neural Network (VRNN) model with discrete latent states to learn dialogue structure in an unsupervised fashion.
Compared to a vanilla VRNN, structured attention enables a model to focus on different parts of the source sentence embeddings while enforcing a structural inductive bias.
arXiv Detail & Related papers (2020-09-17T23:07:03Z)