Using Domain Knowledge to Guide Dialog Structure Induction via Neural Probabilistic Soft Logic
- URL: http://arxiv.org/abs/2403.17853v1
- Date: Tue, 26 Mar 2024 16:42:30 GMT
- Title: Using Domain Knowledge to Guide Dialog Structure Induction via Neural Probabilistic Soft Logic
- Authors: Connor Pryor, Quan Yuan, Jeremiah Liu, Mehran Kazemi, Deepak Ramachandran, Tania Bedrax-Weiss, Lise Getoor
- Abstract summary: Dialog Structure Induction (DSI) is the task of inferring the latent dialog structure of a given goal-oriented dialog.
Existing DSI approaches are often purely data-driven, deploying models that infer latent states without access to domain knowledge.
We introduce Neural Probabilistic Soft Logic Dialogue Structure Induction (NEUPSL DSI), a principled approach that injects symbolic knowledge into the latent space of a generative neural model.
- Score: 21.3531538363406
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dialog Structure Induction (DSI) is the task of inferring the latent dialog structure (i.e., a set of dialog states and their temporal transitions) of a given goal-oriented dialog. It is a critical component for modern dialog system design and discourse analysis. Existing DSI approaches are often purely data-driven, deploy models that infer latent states without access to domain knowledge, underperform when the training corpus is limited/noisy, or have difficulty when test dialogs exhibit distributional shifts from the training domain. This work explores a neural-symbolic approach as a potential solution to these problems. We introduce Neural Probabilistic Soft Logic Dialogue Structure Induction (NEUPSL DSI), a principled approach that injects symbolic knowledge into the latent space of a generative neural model. We conduct a thorough empirical investigation on the effect of NEUPSL DSI learning on hidden representation quality, few-shot learning, and out-of-domain generalization performance. Over three dialog structure induction datasets and across unsupervised and semi-supervised settings for standard and cross-domain generalization, the injection of symbolic knowledge using NEUPSL DSI provides a consistent boost in performance over the canonical baselines.
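The abstract describes injecting symbolic domain knowledge into the latent space of a generative neural model. As a rough, hypothetical sketch of that idea (not the authors' implementation): a domain rule such as "a greeting turn is followed by a request turn" can be relaxed with Lukasiewicz logic, the relaxation used by PSL, into a differentiable penalty on the model's per-turn state posteriors. The state names, probabilities, and rule weight below are illustrative assumptions.

```python
import numpy as np

# Hypothetical per-turn posteriors over latent dialog states produced by a
# neural encoder: rows are turns, columns are states (greet, request, inform).
state_probs = np.array([
    [0.90, 0.07, 0.03],   # turn 0: almost certainly "greet"
    [0.20, 0.30, 0.50],   # turn 1: the network is unsure
    [0.05, 0.80, 0.15],   # turn 2
])

GREET, REQUEST = 0, 1

def lukasiewicz_implication(p_body, p_head):
    """Soft truth value of (body -> head) under the Lukasiewicz relaxation."""
    return min(1.0, 1.0 - p_body + p_head)

def rule_penalty(probs, weight=1.0):
    """Weighted distance to satisfaction of the hypothetical rule
    State(t, greet) -> State(t+1, request), summed over turns."""
    penalty = 0.0
    for t in range(len(probs) - 1):
        truth = lukasiewicz_implication(probs[t, GREET], probs[t + 1, REQUEST])
        penalty += weight * (1.0 - truth)
    return penalty

print(f"symbolic-constraint penalty: {rule_penalty(state_probs):.3f}")
```

A term of this shape can be added to the generative model's training objective so that gradients from the symbolic rules shape the latent dialog-state representations; the actual rules, weighting, and training setup are described in the paper itself.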
Related papers
- Injecting linguistic knowledge into BERT for Dialogue State Tracking [60.42231674887294]
This paper proposes a method that extracts linguistic knowledge via an unsupervised framework.
We then utilize this knowledge to augment BERT's performance and interpretability in Dialogue State Tracking (DST) tasks.
We benchmark this framework on various DST tasks and observe a notable improvement in accuracy.
arXiv Detail & Related papers (2023-11-27T08:38:42Z)
- Emotion Recognition in Conversation using Probabilistic Soft Logic [17.62924003652853]
Emotion recognition in conversation (ERC) is a sub-field of emotion recognition that focuses on conversations containing two or more utterances.
We implement our approach in a framework called Probabilistic Soft Logic (PSL), a declarative templating language.
PSL provides functionality for the incorporation of results from neural models into PSL models.
We compare our method with state-of-the-art purely neural ERC systems, and see almost a 20% improvement.
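PSL grounds weighted logical rules into hinge-loss potentials over soft truth values in [0, 1], and neural predictions can enter as observed atoms. The minimal sketch below illustrates this with invented predicates, rules, and weights (they are not the templates used in the paper): a neural emotion classifier's output both directly supports an emotion label and propagates it along a reply edge.

```python
# Toy grounding of two hypothetical PSL rules for emotion recognition in
# conversation; predicate names, weights, and values are made up for illustration.

def luk_and(a, b):
    """Lukasiewicz conjunction of two soft truth values in [0, 1]."""
    return max(0.0, a + b - 1.0)

def hinge_potential(body, head):
    """PSL hinge-loss potential: distance to satisfaction of (body -> head)."""
    return max(0.0, body - head)

# Soft truth of NeuralEmotion(utt, "sad") taken directly from a (hypothetical)
# neural classifier's softmax output for two consecutive utterances.
neural_sad_u1 = 0.85
neural_sad_u2 = 0.30

# Candidate value of the target atom Emotion(u2, "sad") that PSL would infer.
emotion_sad_u2 = 0.40
replies = 1.0  # observed: u2 replies to u1

# Rule 1 (weight 1.0): NeuralEmotion(u2, sad) -> Emotion(u2, sad)
# Rule 2 (weight 0.5): NeuralEmotion(u1, sad) & Replies(u2, u1) -> Emotion(u2, sad)
loss = (1.0 * hinge_potential(neural_sad_u2, emotion_sad_u2)
        + 0.5 * hinge_potential(luk_and(neural_sad_u1, replies), emotion_sad_u2))
print(f"total weighted distance to satisfaction: {loss:.3f}")
```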
arXiv Detail & Related papers (2022-07-14T23:59:06Z)
- Structure Extraction in Task-Oriented Dialogues with Slot Clustering [94.27806592467537]
In task-oriented dialogues, dialogue structure has often been considered as transition graphs among dialogue states.
We propose a simple yet effective approach for structure extraction in task-oriented dialogues.
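One common way to materialize this transition-graph view of dialog structure, sketched here under the assumption that a clustering step has already assigned each turn an induced state id (the ids and dialogs below are made up), is to count transitions between consecutive states and normalize them into edge probabilities.

```python
from collections import Counter, defaultdict

# Hypothetical output of an utterance/slot clustering step: each dialog is a
# sequence of induced state ids (cluster labels), one per turn.
dialogs = [
    [0, 1, 2, 3],      # e.g. greet -> request -> inform -> confirm
    [0, 1, 1, 2, 3],
    [0, 2, 3],
]

# Count transitions between consecutive induced states across all dialogs.
transition_counts = defaultdict(Counter)
for states in dialogs:
    for src, dst in zip(states, states[1:]):
        transition_counts[src][dst] += 1

# Normalize counts into a transition-probability graph (the induced structure).
structure = {
    src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
    for src, dsts in transition_counts.items()
}
for src, dsts in sorted(structure.items()):
    print(src, "->", dsts)
```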
arXiv Detail & Related papers (2022-02-28T20:18:12Z)
- Every time I fire a conversational designer, the performance of the dialog system goes down [0.07696728525672149]
We investigate how the use of explicit domain knowledge of conversational designers affects the performance of neural-based dialogue systems.
We propose the Conversational-Logic-Injection-in-Neural-Network system (CLINN) where explicit knowledge is coded in semi-logical rules.
arXiv Detail & Related papers (2021-09-27T13:05:31Z)
- Preliminary study on using vector quantization latent spaces for TTS/VC systems with consistent performance [55.10864476206503]
We investigate the use of quantized vectors to model the latent linguistic embedding.
By enforcing different policies over the latent spaces during training, we are able to obtain a latent linguistic embedding.
Our experiments show that the voice cloning system built with vector quantization shows only a small degradation in perceptual evaluations.
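As a generic illustration of the vector-quantization step such systems rely on (the codebook size, dimensionality, and data below are arbitrary assumptions, not the paper's configuration), each continuous encoder output is replaced by its nearest codebook entry, yielding a discrete latent code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical codebook of 8 discrete latent vectors (dim 4) and a batch of
# continuous encoder outputs to be quantized.
codebook = rng.normal(size=(8, 4))
encoder_out = rng.normal(size=(3, 4))

# Vector quantization: replace each encoder output by its nearest codebook entry.
dists = np.linalg.norm(encoder_out[:, None, :] - codebook[None, :, :], axis=-1)
codes = dists.argmin(axis=1)          # discrete indices, one per input vector
quantized = codebook[codes]           # the quantized latent embedding

print("selected codes:", codes)
print("quantization error:", np.linalg.norm(encoder_out - quantized, axis=1))
```

In an actual TTS/VC training setup, a straight-through gradient estimator and a commitment loss are typically used alongside this lookup.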
arXiv Detail & Related papers (2021-06-25T07:51:35Z)
- Discovering Dialog Structure Graph for Open-Domain Dialog Generation [51.29286279366361]
We conduct unsupervised discovery of dialog structure from chitchat corpora.
We then leverage it to facilitate dialog generation in downstream systems.
We present a Discrete Variational Auto-Encoder with Graph Neural Network (DVAE-GNN) to discover a unified human-readable dialog structure.
arXiv Detail & Related papers (2020-12-31T10:58:37Z)
- Collaboratively boosting data-driven deep learning and knowledge-guided ontological reasoning for semantic segmentation of remote sensing imagery [2.342488890032597]
DSSN can be trained end-to-end and is competent at exploiting low-level and mid-level cues.
Human beings have an excellent inference capacity and can reliably interpret RS imagery.
This paper proposes a collaboratively boosting framework (CBF) that combines a data-driven deep learning module with a knowledge-guided ontological reasoning module.
arXiv Detail & Related papers (2020-10-06T03:32:17Z)
- Structured Attention for Unsupervised Dialogue Structure Induction [110.12561786644122]
We propose to incorporate structured attention layers into a Variational Recurrent Neural Network (VRNN) model with discrete latent states to learn dialogue structure in an unsupervised fashion.
Compared to a vanilla VRNN, structured attention enables a model to focus on different parts of the source sentence embeddings while enforcing a structural inductive bias.
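The sketch below shows only the plain soft-attention pooling that such a layer builds on, with made-up token embeddings and a query derived from a latent state; the structured variant in the paper additionally imposes a structural inductive bias on the attention weights, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical word embeddings for one utterance (5 tokens, dim 8) and a query
# vector derived from the model's current latent dialog state.
tokens = rng.normal(size=(5, 8))
state_query = rng.normal(size=(8,))

# Plain soft attention over the source sentence embeddings.
scores = tokens @ state_query
weights = np.exp(scores - scores.max())
weights /= weights.sum()
utterance_repr = weights @ tokens   # attention-pooled sentence representation

print("attention weights:", np.round(weights, 3))
print("pooled representation shape:", utterance_repr.shape)
```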
arXiv Detail & Related papers (2020-09-17T23:07:03Z)
- Modelling Hierarchical Structure between Dialogue Policy and Natural Language Generator with Option Framework for Task-oriented Dialogue System [49.39150449455407]
HDNO is an option framework for designing latent dialogue acts, avoiding the need to hand-design specific dialogue act representations.
We test HDNO on MultiWOZ 2.0 and MultiWOZ 2.1, datasets of multi-domain dialogues, comparing it with a word-level E2E model trained with RL, LaRL, and HDSA.
arXiv Detail & Related papers (2020-06-11T20:55:28Z)
- Probing Neural Dialog Models for Conversational Understanding [21.76744391202041]
We analyze the internal representations learned by neural open-domain dialog systems.
Our results suggest that standard open-domain dialog systems struggle with answering questions.
We also find that the dyadic, turn-taking nature of dialog is not fully leveraged by these models.
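A typical probing setup, sketched here with random stand-in data (the representations, property, and dimensions are placeholders, not the paper's), freezes the dialog model and trains a small linear classifier on its hidden states to test whether a property such as "is this turn a question?" is linearly decodable:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Stand-ins for hidden states taken from a dialog model's encoder (dim 16) and
# a binary property to probe for.
hidden_states = rng.normal(size=(200, 16))
is_question = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    hidden_states, is_question, test_size=0.25, random_state=0)

# A linear probe: if a simple classifier recovers the property from the frozen
# representations, the model is considered to have encoded it.
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"probe accuracy: {probe.score(X_test, y_test):.2f}")
```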
arXiv Detail & Related papers (2020-06-07T17:32:00Z)