Dynamic Relevance Graph Network for Knowledge-Aware Question Answering
- URL: http://arxiv.org/abs/2209.09947v1
- Date: Tue, 20 Sep 2022 18:52:05 GMT
- Title: Dynamic Relevance Graph Network for Knowledge-Aware Question Answering
- Authors: Chen Zheng and Parisa Kordjamshidi
- Abstract summary: This work investigates the challenge of learning and reasoning for Commonsense Question Answering given an external source of knowledge.
We propose a novel graph neural network architecture, called Dynamic Relevance Graph Network (DRGN).
DRGN operates on a given KG subgraph based on the question and answer entities and uses the relevance scores between the nodes to establish new edges.
- Score: 22.06211725256875
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work investigates the challenge of learning and reasoning for
Commonsense Question Answering given an external source of knowledge in the
form of a knowledge graph (KG). We propose a novel graph neural network
architecture, called Dynamic Relevance Graph Network (DRGN). DRGN operates on a
given KG subgraph based on the question and answer entities and uses the
relevance scores between the nodes to establish new edges dynamically for
learning node representations in the graph network. This explicit use of
relevance as graph edges has the following advantages: (a) the model can
exploit the existing relationships, re-scale the node weights, and influence
how the neighborhood nodes' representations are aggregated in the KG subgraph;
(b) it potentially recovers missing edges in the KG that are needed for
reasoning. Moreover, as a byproduct, our model improves the handling of
negative questions by considering the relevance between the question node and
the graph entities. Our proposed approach shows competitive performance on two
QA benchmarks, CommonsenseQA and OpenBookQA, compared to the state-of-the-art
published results.
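The dynamic-edge idea in the abstract can be illustrated with a short sketch. The code below is a hypothetical simplification, not the authors' implementation: it assumes relevance is cosine similarity between node embeddings, adds edges wherever that score exceeds a threshold, and re-scales aggregation weights by the scores. The function names, the threshold value, and the mean-aggregation step are all illustrative assumptions.

```python
import numpy as np

def dynamic_relevance_edges(node_emb, adj, threshold=0.5):
    """Hypothetical sketch of DRGN-style dynamic edge construction:
    pairwise relevance scores between node embeddings add new edges
    on top of the existing KG-subgraph adjacency."""
    # Relevance as cosine similarity between node embeddings.
    norm = node_emb / np.linalg.norm(node_emb, axis=1, keepdims=True)
    relevance = norm @ norm.T                 # (N, N) scores in [-1, 1]
    np.fill_diagonal(relevance, 0.0)          # no self-loops
    new_edges = (relevance > threshold).astype(float)
    # Union of original KG edges and relevance-induced edges,
    # with new edges weighted by their relevance scores.
    return np.maximum(adj, new_edges * relevance)

def aggregate(node_emb, combined):
    """One round of mean neighborhood aggregation over the combined graph."""
    deg = combined.sum(axis=1, keepdims=True) + 1e-9
    return (combined / deg) @ node_emb

# Toy example: 4 nodes, 3-dim embeddings, a sparse original subgraph.
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 3))
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 0]], dtype=float)
combined = dynamic_relevance_edges(emb, adj, threshold=0.5)
updated = aggregate(emb, combined)
```

Note how the isolated fourth node (all-zero row in `adj`) can still receive neighbors through relevance-induced edges, which mirrors the abstract's point (b) about recovering missing KG edges.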
Related papers
- Neighbor Overlay-Induced Graph Attention Network [5.792501481702088]
Graph neural networks (GNNs) have garnered significant attention due to their ability to represent graph data.
This study proposes a neighbor overlay-induced graph attention network (NO-GAT) built on two ideas.
Empirical studies on graph benchmark datasets indicate that the proposed NO-GAT consistently outperforms state-of-the-art models.
arXiv Detail & Related papers (2024-08-16T15:01:28Z)
- PipeNet: Question Answering with Semantic Pruning over Knowledge Graphs [56.5262495514563]
We propose a grounding-pruning-reasoning pipeline to prune noisy computation nodes.
We also propose a graph attention network (GAT) based module to reason with the subgraph data.
arXiv Detail & Related papers (2024-01-31T01:37:33Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results conducted on several graph benchmark datasets verify DGNN's superiority in node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Generating Post-hoc Explanations for Skip-gram-based Node Embeddings by Identifying Important Nodes with Bridgeness [19.448849238643582]
Unsupervised node embedding methods such as DeepWalk, LINE, struc2vec, PTE, UserItem2vec, and RWJBG have emerged from the Skip-gram model.
In this paper, we show that global explanations to the Skip-gram-based embeddings can be found by computing bridgeness under a spectral cluster-aware local perturbation.
A novel gradient-based explanation method, called GRAPH-wGD, is proposed that identifies the top-q global explanations for learned graph embedding vectors more efficiently.
arXiv Detail & Related papers (2023-04-24T12:25:35Z)
- Question-Answer Sentence Graph for Joint Modeling Answer Selection [122.29142965960138]
We train and integrate state-of-the-art (SOTA) models for computing scores between question-question, question-answer, and answer-answer pairs.
Online inference is then performed to solve the AS2 task on unseen queries.
arXiv Detail & Related papers (2022-02-16T05:59:53Z)
- JointLK: Joint Reasoning with Language Models and Knowledge Graphs for Commonsense Question Answering [3.7948783125888363]
Existing KG-augmented models for question answering primarily focus on designing elaborate Graph Neural Networks (GNNs) to model knowledge graphs (KGs).
We propose a novel model, JointLK, which solves the above limitations through the joint reasoning of LMs and GNNs and the dynamic KGs pruning mechanism.
Our results on the CommonsenseQA and OpenBookQA datasets demonstrate that our modal fusion and knowledge pruning methods can make better use of relevant knowledge for reasoning.
arXiv Detail & Related papers (2021-12-06T01:46:46Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2020-04-13T15:43:22Z)
- Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks [53.58077686470096]
Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers.
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
arXiv Detail & Related papers (2020-04-13T15:43:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.