Neural-Symbolic Relational Reasoning on Graph Models: Effective Link
Inference and Computation from Knowledge Bases
- URL: http://arxiv.org/abs/2005.02525v1
- Date: Tue, 5 May 2020 22:46:39 GMT
- Title: Neural-Symbolic Relational Reasoning on Graph Models: Effective Link
Inference and Computation from Knowledge Bases
- Authors: Henrique Lemos and Pedro Avelar and Marcelo Prates and Luís Lamb and
Artur Garcez
- Abstract summary: We propose a neural-symbolic graph neural network which applies learning over all the paths by feeding the model with the embedding of the minimal subset of the knowledge graph containing such paths.
By learning to produce representations for entities and facts corresponding to word embeddings, we show how the model can be trained end-to-end to decode these representations and infer relations between entities in a multitask approach.
- Score: 0.5669790037378094
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recent developments and growing interest in neural-symbolic models have
shown that hybrid approaches can offer richer models for Artificial
Intelligence. The integration of effective relational learning and reasoning
methods is one of the key challenges in this direction, as neural learning and
symbolic reasoning offer complementary characteristics that can benefit the
development of AI systems. Relational labelling or link prediction on knowledge
graphs has become one of the main problems in deep learning-based natural
language processing research. Moreover, other fields which make use of
neural-symbolic techniques may also benefit from such research endeavours.
There have been several efforts towards the identification of missing facts
from existing ones in knowledge graphs. Two lines of research try to predict
knowledge relations between two entities by considering all known facts
connecting them or several paths of facts connecting them. We propose a
neural-symbolic graph neural network which applies learning over all the paths
by feeding the model with the embedding of the minimal subset of the knowledge
graph containing such paths. By learning to produce representations for
entities and facts corresponding to word embeddings, we show how the model can
be trained end-to-end to decode these representations and infer relations
between entities in a multitask approach. Our contribution is two-fold: a
neural-symbolic methodology that leverages the resolution of relational
inference in large graphs, and a demonstration that such a neural-symbolic
model is more effective than path-based approaches.
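The core idea of the abstract can be illustrated with a minimal sketch (not the authors' implementation): embed the entities of the minimal subgraph containing the paths between two endpoints, run a few rounds of message passing over its facts, and score candidate relations between the endpoints. The entity and relation names, the neighbour-averaging update, and the additive scoring function below are all hypothetical simplifications chosen for brevity.

```python
import random

random.seed(0)
DIM = 8  # embedding dimension for this toy example

def init_embedding():
    return [random.uniform(-1, 1) for _ in range(DIM)]

def add_vec(a, b):
    return [x + y for x, y in zip(a, b)]

def scale(a, s):
    return [x * s for x in a]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def message_pass(entities, facts, emb, rounds=2):
    """Toy propagation: each entity averages its embedding with those of
    its neighbours in the subgraph, for a fixed number of rounds."""
    for _ in range(rounds):
        new_emb = {}
        for e in entities:
            neigh = [emb[t] for (h, _, t) in facts if h == e] + \
                    [emb[h] for (h, _, t) in facts if t == e]
            acc = emb[e]
            for v in neigh:
                acc = add_vec(acc, v)
            new_emb[e] = scale(acc, 1.0 / (1 + len(neigh)))
        emb = new_emb
    return emb

# Hypothetical minimal subgraph containing the paths between "alice" and "carol".
facts = [("alice", "knows", "bob"), ("bob", "worksWith", "carol")]
entities = {"alice", "bob", "carol"}
emb = {e: init_embedding() for e in entities}
rel = {"knows": init_embedding(), "worksWith": init_embedding()}

emb = message_pass(entities, facts, emb)

def score(h, r, t):
    # Higher score = candidate relation r between h and t judged more plausible.
    return dot(add_vec(emb[h], emb[t]), rel[r])

scores = {r: score("alice", r, "carol") for r in rel}
```

In the paper's setting the embeddings and scoring would be trained end-to-end rather than fixed at random; this sketch only shows the data flow from subgraph to relation scores.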
Related papers
- A Novel Neural-symbolic System under Statistical Relational Learning [50.747658038910565]
We propose a general bi-level probabilistic graphical reasoning framework called GBPGR.
In GBPGR, the results of symbolic reasoning are utilized to refine and correct the predictions made by the deep learning models.
Our approach achieves high performance and exhibits effective generalization in both transductive and inductive tasks.
arXiv Detail & Related papers (2023-09-16T09:15:37Z)
- Neurosymbolic AI for Reasoning over Knowledge Graphs: A Survey [0.0]
We survey methods that perform neurosymbolic reasoning tasks on knowledge graphs and propose a novel taxonomy by which we can classify them.
Specifically, we propose three major categories: (1) logically-informed embedding approaches, (2) embedding approaches with logical constraints, and (3) rule learning approaches.
arXiv Detail & Related papers (2023-02-14T17:24:30Z)
- A Theory of Link Prediction via Relational Weisfeiler-Leman on Knowledge Graphs [6.379544211152605]
Graph neural networks are prominent models for representation learning over graph-structured data.
Our goal is to provide a systematic understanding of the landscape of graph neural networks for knowledge graphs.
arXiv Detail & Related papers (2023-02-04T17:40:03Z)
- Neuro-symbolic computing with spiking neural networks [0.6035125735474387]
We extend previous work on spike-based graph algorithms by demonstrating how symbolic and multi-relational information can be encoded using spiking neurons.
The introduced framework is enabled by combining the graph embedding paradigm and the recent progress in training spiking neural networks using error backpropagation.
arXiv Detail & Related papers (2022-08-04T10:49:34Z)
- Contrastive Brain Network Learning via Hierarchical Signed Graph Pooling Model [64.29487107585665]
Graph representation learning techniques on brain functional networks can facilitate the discovery of novel biomarkers for clinical phenotypes and neurodegenerative diseases.
Here, we propose an interpretable hierarchical signed graph representation learning model to extract graph-level representations from brain functional networks.
In order to further improve the model performance, we also propose a new strategy to augment functional brain network data for contrastive learning.
arXiv Detail & Related papers (2022-07-14T20:03:52Z)
- Neuro-Symbolic Learning of Answer Set Programs from Raw Data [54.56905063752427]
Neuro-Symbolic AI aims to combine interpretability of symbolic techniques with the ability of deep learning to learn from raw data.
We introduce Neuro-Symbolic Inductive Learner (NSIL), an approach that trains a general neural network to extract latent concepts from raw data.
NSIL learns expressive knowledge, solves computationally complex problems, and achieves state-of-the-art performance in terms of accuracy and data efficiency.
arXiv Detail & Related papers (2022-05-25T12:41:59Z)
- Relational representation learning with spike trains [0.0]
We present a model that allows us to learn spike train-based embeddings of knowledge graphs, requiring only one neuron per graph element by fully utilizing the temporal domain of spike patterns.
In general, the presented results show how relational knowledge can be integrated into spike-based systems, opening up the possibility of merging event-based computing and data to build powerful and energy efficient artificial intelligence applications and reasoning systems.
arXiv Detail & Related papers (2022-05-18T18:00:37Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- pix2rule: End-to-end Neuro-symbolic Rule Learning [84.76439511271711]
This paper presents a complete neuro-symbolic method for processing images into objects, learning relations and logical rules.
The main contribution is a differentiable layer in a deep learning architecture from which symbolic relations and rules can be extracted.
We demonstrate that our model scales beyond state-of-the-art symbolic learners and outperforms deep relational neural network architectures.
arXiv Detail & Related papers (2021-06-14T15:19:06Z)
- A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.