A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for
Question Answering Over Dynamic Contexts
- URL: http://arxiv.org/abs/2004.12057v1
- Date: Sat, 25 Apr 2020 04:53:54 GMT
- Title: A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for
Question Answering Over Dynamic Contexts
- Authors: Wanjun Zhong, Duyu Tang, Nan Duan, Ming Zhou, Jiahai Wang, Jian Yin
- Abstract summary: We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
- Score: 81.4757750425247
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study question answering over a dynamic textual environment. Although
neural network models achieve impressive accuracy via learning from
input-output examples, they rarely leverage various types of knowledge and are
generally not interpretable. In this work, we propose a graph-based approach,
where a heterogeneous graph is automatically built with factual knowledge of
the context, temporal knowledge of the past states, and logical knowledge that
combines human-curated knowledge bases and rule bases. We develop a graph
neural network over the constructed graph, and train the model in an end-to-end
manner. Experimental results on a benchmark dataset show that the injection of
various types of knowledge improves a strong neural network baseline. An
additional benefit of our approach is that the graph itself naturally serves as
a rationale behind the decision making.
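The approach described in the abstract can be pictured as a graph whose nodes carry different knowledge types, with a GNN aggregating over it. The sketch below is illustrative only, not the authors' implementation: node names, feature sizes, and the single averaging step are all assumptions made for clarity.

```python
# Minimal sketch (not the paper's code): a heterogeneous graph whose nodes
# carry factual, temporal, or logical knowledge, plus one simplified round
# of message passing. All node names and features are hypothetical.
from collections import defaultdict

# Node -> (knowledge type, feature vector); types mirror the three sources
# named in the abstract: factual, temporal, logical.
nodes = {
    "fact:door_open":   ("factual",  [1.0, 0.0]),
    "state:t-1":        ("temporal", [0.0, 1.0]),
    "rule:open->enter": ("logical",  [0.5, 0.5]),
}

# Directed edges along which messages flow.
edges = [
    ("fact:door_open", "rule:open->enter"),
    ("state:t-1",      "rule:open->enter"),
]

def message_pass(nodes, edges):
    """One aggregation step: each node averages its own features with
    those of its in-neighbors (a heavily simplified GNN layer)."""
    incoming = defaultdict(list)
    for src, dst in edges:
        incoming[dst].append(nodes[src][1])
    updated = {}
    for name, (ntype, feat) in nodes.items():
        pooled = [feat] + incoming.get(name, [])
        updated[name] = (ntype, [sum(vals) / len(pooled)
                                 for vals in zip(*pooled)])
    return updated

out = message_pass(nodes, edges)
print(out["rule:open->enter"][1])  # aggregated representation of the rule node
```

In the real model each knowledge type would have its own learned transformation and the whole pipeline would be trained end-to-end; the point here is only how heterogeneous nodes feed a shared aggregation step.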
Related papers
- Intrinsically motivated graph exploration using network theories of
human curiosity [71.2717061477241]
We propose a novel approach for exploring graph-structured data motivated by two theories of human curiosity.
We use these proposed features as rewards for graph neural-network-based reinforcement learning.
arXiv Detail & Related papers (2023-07-11T01:52:08Z) - Probing Graph Representations [77.7361299039905]
We use a probing framework to quantify the amount of meaningful information captured in graph representations.
Our findings on molecular datasets show the potential of probing for understanding the inductive biases of graph-based models.
We advocate for probing as a useful diagnostic tool for evaluating graph-based models.
arXiv Detail & Related papers (2023-03-07T14:58:18Z) - A Theory of Link Prediction via Relational Weisfeiler-Leman on Knowledge
Graphs [6.379544211152605]
Graph neural networks are prominent models for representation learning over graph-structured data.
Our goal is to provide a systematic understanding of the landscape of graph neural networks for knowledge graphs.
arXiv Detail & Related papers (2023-02-04T17:40:03Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Relational representation learning with spike trains [0.0]
We present a model that allows us to learn spike train-based embeddings of knowledge graphs, requiring only one neuron per graph element by fully utilizing the temporal domain of spike patterns.
In general, the presented results show how relational knowledge can be integrated into spike-based systems, opening up the possibility of merging event-based computing and data to build powerful and energy-efficient artificial intelligence applications and reasoning systems.
arXiv Detail & Related papers (2022-05-18T18:00:37Z) - Learning to Evolve on Dynamic Graphs [5.1521870302904125]
Learning to Evolve on Dynamic Graphs (LEDG) is a novel algorithm that jointly learns graph information and time information.
LEDG is model-agnostic and can train any message passing based graph neural network (GNN) on dynamic graphs.
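The model-agnostic idea can be sketched as a loop that applies any message-passing function to each timestamped graph snapshot while carrying a running temporal summary. The function names, the toy degree-based "GNN", and the smoothing scheme below are assumptions for illustration, not LEDG's actual API.

```python
# Hedged sketch of model-agnostic learning over dynamic graphs: an arbitrary
# `gnn` callable is applied per snapshot, and time information is folded in
# via a simple running average. Names are illustrative, not LEDG's API.
def run_on_dynamic_graph(snapshots, gnn):
    """Apply a pluggable GNN to each timestamped (nodes, edges) snapshot,
    carrying a smoothed summary across time steps."""
    history = 0.0
    outputs = []
    for t, (nodes, edges) in enumerate(snapshots):
        emb = gnn(nodes, edges)              # any message-passing GNN fits here
        history = 0.5 * history + 0.5 * emb  # naive temporal smoothing
        outputs.append((t, history))
    return outputs

# A trivial stand-in "GNN": mean degree as a scalar graph embedding.
def degree_gnn(nodes, edges):
    return 2 * len(edges) / max(len(nodes), 1)

snaps = [({1, 2, 3}, [(1, 2)]),
         ({1, 2, 3}, [(1, 2), (2, 3)])]
print(run_on_dynamic_graph(snaps, degree_gnn))
```

Because `gnn` is just a callable, any message-passing model can be swapped in without changing the temporal loop, which is the sense in which such a framework is model-agnostic.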
arXiv Detail & Related papers (2021-11-13T04:09:30Z) - Learning through structure: towards deep neuromorphic knowledge graph
embeddings [0.5906031288935515]
We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that randomly initialized and untrained graph neural networks are able to preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models.
We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
arXiv Detail & Related papers (2021-09-21T18:01:04Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
However, performance often degrades as models grow deeper; several recent studies attribute this deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.