Learning through structure: towards deep neuromorphic knowledge graph embeddings
- URL: http://arxiv.org/abs/2109.10376v1
- Date: Tue, 21 Sep 2021 18:01:04 GMT
- Title: Learning through structure: towards deep neuromorphic knowledge graph embeddings
- Authors: Victor Caceres Chian, Marcel Hildebrandt, Thomas Runkler, Dominik Dold
- Abstract summary: We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that randomly initialized and untrained graph neural networks preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models.
We show experimentally that, even on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
- Score: 0.5906031288935515
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computing latent representations for graph-structured data is a
ubiquitous learning task in many industrial and academic applications, ranging
from molecule synthesis to social network analysis and recommender systems.
Knowledge graphs are among the most popular and widely used data
representations related to the Semantic Web. Besides structuring factual
knowledge in a machine-readable format, knowledge graphs serve as the backbone
of many artificial intelligence applications and allow the ingestion of context
information into various learning algorithms. Graph neural networks attempt to
encode graph structures in low-dimensional vector spaces via a message passing
heuristic between neighboring nodes. In recent years, a multitude of graph
neural network architectures have demonstrated ground-breaking performance in
many learning tasks. In this work, we propose a strategy to map
deep graph learning architectures for knowledge graph reasoning to neuromorphic
architectures. Based on the insight that randomly initialized and untrained
(i.e., frozen) graph neural networks are able to preserve local graph
structures, we compose a frozen neural network with shallow knowledge graph
embedding models. We show experimentally that, even on conventional computing
hardware, this leads to a significant speedup and memory reduction while
maintaining a competitive performance level. Moreover, we extend the frozen
architecture to spiking neural networks, introducing a novel, event-based and
highly sparse knowledge graph embedding algorithm that is suitable for
implementation in neuromorphic hardware.
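As a rough illustration of this composition, the sketch below pairs a randomly initialized, frozen message-passing layer with trainable shallow node embeddings and a DistMult-style decoder. The single message-passing round, the tanh nonlinearity, and the DistMult decoder are illustrative assumptions, not the paper's exact architecture.
```python
# Minimal sketch (PyTorch), assuming one message-passing round and a
# DistMult-style decoder; the paper's exact frozen architecture may differ.
import torch

class FrozenGNNEncoder(torch.nn.Module):
    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        # Shallow node embeddings: the only trainable parameters here.
        self.node_emb = torch.nn.Embedding(num_nodes, dim)
        # Random projection that is never updated ("frozen").
        self.frozen_proj = torch.nn.Linear(dim, dim, bias=False)
        for p in self.frozen_proj.parameters():
            p.requires_grad = False

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        h = self.node_emb.weight                  # (num_nodes, dim)
        # Aggregate neighbor features, then apply the frozen projection;
        # gradients only flow into the shallow embeddings.
        return torch.tanh(self.frozen_proj(adj @ h))

def distmult_score(h: torch.Tensor, r: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
    """DistMult triple score: sum_i h_i * r_i * t_i."""
    return (h * r * t).sum(dim=-1)
```
Since only the shallow embeddings (plus relation vectors) receive gradients, far fewer parameters are updated than in a fully trained GNN, which is where the reported speedup and memory savings would come from.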
Related papers
- Knowledge Enhanced Graph Neural Networks for Graph Completion [0.0]
Knowledge Enhanced Graph Neural Networks (KeGNN) is a neuro-symbolic framework for graph completion.
KeGNN consists of a graph neural network as a base upon which knowledge enhancement layers are stacked.
We instantiate KeGNN in conjunction with two state-of-the-art graph neural networks, Graph Convolutional Networks and Graph Attention Networks.
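A hedged sketch of the stacking idea: a base GNN produces class logits, and a knowledge enhancement layer nudges them toward satisfying a neighborhood rule. The rule and the blending scheme below are illustrative assumptions; KeGNN's actual fuzzy-logic enhancement layers are more elaborate.
```python
# Illustrative only: an enhancement step nudging base GNN logits toward a
# rule like Link(x, y) -> SameClass(x, y); not KeGNN's exact formulation.
import torch

def knowledge_enhance(base_logits: torch.Tensor, adj: torch.Tensor,
                      strength: float = 0.5) -> torch.Tensor:
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    neighbor_avg = (adj @ base_logits) / deg      # average neighbor logits
    return base_logits + strength * neighbor_avg  # blend in rule evidence
```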
arXiv Detail & Related papers (2023-03-27T07:53:43Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks, including comparison, regression, and classification.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning remains limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
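For intuition, distances in the Poincaré ball, one common model of hyperbolic space, grow rapidly toward the boundary, which suits tree-like, power-law data. A sketch of that distance follows; which hyperbolic model (Poincaré ball, Lorentz, ...) a given paper uses is an assumption here.
```python
# Poincare-ball distance between two points with norm < 1.
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_dist / denom))
```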
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- SpikE: spike-based embeddings for multi-relational graph data [0.0]
Spiking neural networks are still mostly applied to tasks stemming from sensory processing.
A rich data representation that finds wide application in industry and research is the so-called knowledge graph.
We propose a spike-based algorithm where nodes in a graph are represented by single spike times of neuron populations.
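A toy rendering of that idea: with nodes as spike-time vectors and relations as time-delay vectors, a triple can be scored by how well shifting the head's spike times reproduces the tail's (TransE-like, but in the time domain). This is only the flavor of the approach, not the exact SpikE scoring function.
```python
# Toy spike-time scoring: nodes as spike-time vectors of a neuron
# population, relations as time delays; not the exact SpikE model.
import numpy as np

def spike_score(t_head: np.ndarray, delay: np.ndarray, t_tail: np.ndarray) -> float:
    # Less negative = delayed head spikes better match the tail spikes.
    return -float(np.abs(t_head + delay - t_tail).sum())
```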
arXiv Detail & Related papers (2021-04-27T18:00:12Z)
- Learning Graph Representations [0.0]
Graph Neural Networks (GNNs) are an efficient way to gain insight into large, dynamic graph datasets.
In this paper, we discuss graph convolutional neural networks, graph autoencoders, and social-temporal graph neural networks.
arXiv Detail & Related papers (2021-02-03T12:07:55Z)
- Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding transforms and encodes graph-structured data, which lives in a high-dimensional, non-Euclidean feature space, into latent representations.
CensNet is a general graph embedding framework that embeds both nodes and edges into a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
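Contrastive pre-training of this kind typically optimizes an InfoNCE-style objective over two views of the same instance; a generic sketch follows (the encoder and GCC's subgraph sampling are omitted, and the exact loss may differ in detail).
```python
# Generic InfoNCE loss over two views z1, z2 of the same batch of instances.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.07) -> torch.Tensor:
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = (z1 @ z2.t()) / tau               # (batch, batch) similarities
    labels = torch.arange(z1.size(0))          # positives on the diagonal
    return F.cross_entropy(logits, labels)
```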
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We construct a heterogeneous graph from the context, develop a graph neural network over it, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z)
- Hcore-Init: Neural Network Initialization based on Graph Degeneracy [22.923756039561194]
We propose an adapted version of the k-core structure for the complete weighted multipartite graph extracted from a deep learning architecture.
As a multipartite graph is a combination of bipartite graphs, which are in turn the incidence graphs of hypergraphs, we design a k-hypercore decomposition.
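For reference, standard k-core peeling on an unweighted, undirected graph is sketched below; the paper's adaptation to weighted multipartite graphs and hypergraphs is not covered here.
```python
# Standard k-core peeling: repeatedly remove a minimum-degree node; a node's
# core number is the largest k observed when it is removed.
from collections import defaultdict

def core_numbers(edges):
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {u: len(nbrs) for u, nbrs in adj.items()}
    core, k = {}, 0
    while deg:
        u = min(deg, key=deg.get)
        k = max(k, deg[u])
        core[u] = k
        for v in adj[u]:
            if v in deg:
                deg[v] -= 1
        del deg[u]
    return core
```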
arXiv Detail & Related papers (2020-04-16T12:57:14Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
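A hedged sketch of the building block as described: an affine (fully connected) path on the block input added to the output of a graph convolution. The particular convolution and activation below are assumptions.
```python
# Sketch of an affine skip connection: simple graph convolution plus a fully
# connected path on the input; details may differ from the paper.
import torch

class AffineSkipBlock(torch.nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.affine = torch.nn.Linear(dim, dim)           # affine skip path
        self.conv_weight = torch.nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        conv = self.conv_weight(adj @ x)                  # graph convolution
        return torch.relu(conv + self.affine(x))
```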
arXiv Detail & Related papers (2020-04-06T13:25:46Z)