A Bipartite Graph is All We Need for Enhancing Emotional Reasoning with
Commonsense Knowledge
- URL: http://arxiv.org/abs/2308.04811v1
- Date: Wed, 9 Aug 2023 09:09:17 GMT
- Title: A Bipartite Graph is All We Need for Enhancing Emotional Reasoning with
Commonsense Knowledge
- Authors: Kailai Yang, Tianlin Zhang, Shaoxiong Ji, Sophia Ananiadou
- Abstract summary: We propose a Bipartite Heterogeneous Graph (BHG) method for enhancing emotional reasoning with commonsense knowledge.
BHG-based knowledge infusion can be directly generalized to multi-type and multi-grained knowledge sources.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The context-aware emotional reasoning ability of AI systems, especially in
conversations, is of vital importance in applications such as online opinion
mining from social media and empathetic dialogue systems. Due to the implicit
nature of conveying emotions in many scenarios, commonsense knowledge is widely
utilized to enrich utterance semantics and enhance conversation modeling.
However, most previous knowledge infusion methods perform empirical knowledge
filtering and design highly customized architectures for knowledge interaction
with the utterances, which can discard useful knowledge aspects and limit their
generalizability to different knowledge sources. Based on these observations,
we propose a Bipartite Heterogeneous Graph (BHG) method for enhancing emotional
reasoning with commonsense knowledge. In BHG, the extracted context-aware
utterance representations and knowledge representations are modeled as
heterogeneous nodes. Two more knowledge aggregation node types are proposed to
perform automatic knowledge filtering and interaction. BHG-based knowledge
infusion can be directly generalized to multi-type and multi-grained knowledge
sources. In addition, we propose a Multi-dimensional Heterogeneous Graph
Transformer (MHGT) to perform graph reasoning, which can retain unchanged
feature spaces and unequal dimensions for heterogeneous node types during
inference to prevent unnecessary loss of information. Experiments show that
BHG-based methods significantly outperform state-of-the-art knowledge infusion
methods and show generalized knowledge infusion ability with higher efficiency.
Further analysis shows that previous empirical knowledge filtering methods are
not guaranteed to provide the most useful knowledge. Our code is
available at: https://github.com/SteveKGYang/BHG.
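The abstract describes utterances and knowledge snippets as heterogeneous nodes in a bipartite graph, with dedicated aggregation nodes performing automatic knowledge filtering. A minimal sketch of that structure is below; the class name, node identifiers, and toy feature vectors are illustrative assumptions, not the authors' implementation (see the linked repository for the real one). Note the knowledge and utterance features are deliberately given unequal dimensions, mirroring the paper's point that MHGT preserves distinct feature spaces per node type.

```python
# Illustrative sketch (assumed names, not the authors' code) of a bipartite
# heterogeneous graph: utterance nodes on one side, knowledge nodes on the
# other, linked through aggregation nodes that filter/merge knowledge.
from collections import defaultdict

class BHG:
    def __init__(self):
        self.nodes = {}                 # node_id -> {"type": ..., "feat": ...}
        self.edges = defaultdict(list)  # src node_id -> [dst node_id, ...]

    def add_node(self, node_id, node_type, feat):
        self.nodes[node_id] = {"type": node_type, "feat": feat}

    def add_edge(self, src, dst):
        self.edges[src].append(dst)

# Build a toy graph: one utterance, two knowledge snippets, one aggregation
# node that pools the knowledge before it reaches the utterance.
g = BHG()
g.add_node("u1", "utterance", [0.1, 0.2])  # context-aware utterance repr. (dim 2)
g.add_node("k1", "knowledge", [0.3])       # e.g. a commonsense inference (dim 1)
g.add_node("k2", "knowledge", [0.5])
g.add_node("a1", "aggregation", [0.0])     # learned knowledge filtering/merging
for k in ("k1", "k2"):
    g.add_edge(k, "a1")   # knowledge flows into the aggregation node
g.add_edge("a1", "u1")    # aggregated knowledge enriches the utterance

# Bipartite property: every edge crosses between different node types.
assert all(g.nodes[s]["type"] != g.nodes[d]["type"]
           for s, dsts in g.edges.items() for d in dsts)
```

Keeping the feature dimension per node type (rather than projecting everything into one shared space) is the information-preserving choice the MHGT component is built around.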
Related papers
- Open Visual Knowledge Extraction via Relation-Oriented Multimodality
Model Prompting [89.95541601837719]
We take a first exploration to a new paradigm of open visual knowledge extraction.
OpenVik consists of an open relational region detector, which detects regions potentially containing relational knowledge, and a visual knowledge generator, which generates format-free knowledge by prompting a large multimodality model with the detected region of interest.
arXiv Detail & Related papers (2023-10-28T20:09:29Z)
- Recognizing Unseen Objects via Multimodal Intensive Knowledge Graph
Propagation [68.13453771001522]
We propose a multimodal intensive ZSL framework that matches regions of images with corresponding semantic embeddings.
We conduct extensive experiments and evaluate our model on large-scale real-world data.
arXiv Detail & Related papers (2023-06-14T13:07:48Z)
- CADGE: Context-Aware Dialogue Generation Enhanced with Graph-Structured
Knowledge Aggregation [25.56539617837482]
A novel context-aware graph-attention model (Context-aware GAT) is proposed.
It assimilates global features from relevant knowledge graphs through a context-enhanced knowledge aggregation mechanism.
Empirical results demonstrate that our framework outperforms conventional GNN-based language models.
arXiv Detail & Related papers (2023-05-10T16:31:35Z)
- RHO ($\rho$): Reducing Hallucination in Open-domain Dialogues with
Knowledge Grounding [57.46495388734495]
This paper presents RHO ($\rho$), which utilizes the representations of linked entities and relation predicates from a knowledge graph (KG).
We propose (1) local knowledge grounding to combine textual embeddings with the corresponding KG embeddings; and (2) global knowledge grounding to equip RHO with multi-hop reasoning abilities via the attention mechanism.
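The local knowledge grounding step summarized above can be sketched as a simple embedding fusion; the function name, fusion-by-concatenation choice, and toy vectors here are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch of "local knowledge grounding": each token's textual
# embedding is fused with the embedding of its linked KG entity/relation,
# here by simple concatenation (an assumed fusion, for illustration only).
def local_grounding(text_emb, kg_emb):
    """Fuse a textual embedding with its linked KG embedding.

    Tokens with no KG link pass a zero vector of the same length as kg_emb,
    so every grounded vector has a consistent width.
    """
    return text_emb + kg_emb  # list concatenation -> a wider grounded vector

tokens = [[0.1, 0.2], [0.4, 0.5]]  # toy textual embeddings (dim 2)
links  = [[1.0], [0.0]]            # toy KG embedding / zero vector for no link
grounded = [local_grounding(t, k) for t, k in zip(tokens, links)]
# grounded == [[0.1, 0.2, 1.0], [0.4, 0.5, 0.0]]
```

The global grounding component (multi-hop reasoning via attention) would operate on top of these fused representations.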
arXiv Detail & Related papers (2022-12-03T10:36:34Z)
- Knowledge Graph Augmented Network Towards Multiview Representation
Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- Knowledge-Grounded Dialogue Generation with a Unified Knowledge
Representation [78.85622982191522]
Existing systems perform poorly on unseen topics due to limited topics covered in the training data.
We present PLUG, a language model that homogenizes different knowledge sources to a unified knowledge representation.
It achieves performance comparable to state-of-the-art methods under a fully-supervised setting.
arXiv Detail & Related papers (2021-12-15T07:11:02Z)
- Distilling Holistic Knowledge with Graph Neural Networks [37.86539695906857]
Knowledge Distillation (KD) aims at transferring knowledge from a larger well-optimized teacher network to a smaller learnable student network.
Existing KD methods have mainly considered two types of knowledge, namely the individual knowledge and the relational knowledge.
We propose to distill the novel holistic knowledge based on an attributed graph constructed among instances.
arXiv Detail & Related papers (2021-08-12T02:47:59Z)
- KRISP: Integrating Implicit and Symbolic Knowledge for Open-Domain
Knowledge-Based VQA [107.7091094498848]
One of the most challenging question types in VQA is when answering the question requires outside knowledge not present in the image.
In this work we study open-domain knowledge, the setting when the knowledge required to answer a question is not given/annotated, neither at training nor test time.
We tap into two types of knowledge representations and reasoning. The first is implicit knowledge, which can be learned effectively from unsupervised language pre-training and supervised training data with transformer-based models.
arXiv Detail & Related papers (2020-12-20T20:13:02Z)
- Knowledge-graph based Proactive Dialogue Generation with Improved
Meta-Learning [0.0]
We propose a knowledge graph based proactive dialogue generation model (KgDg) with three components.
We formulate knowledge triplet embedding and selection as a sentence-embedding problem to better capture semantic information.
Our improved MAML algorithm is capable of learning general features from a limited number of knowledge graphs.
arXiv Detail & Related papers (2020-04-19T08:41:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.