SpikE: spike-based embeddings for multi-relational graph data
- URL: http://arxiv.org/abs/2104.13398v1
- Date: Tue, 27 Apr 2021 18:00:12 GMT
- Title: SpikE: spike-based embeddings for multi-relational graph data
- Authors: Dominik Dold, Josep Soler Garrido
- Abstract summary: Spiking neural networks are still mostly applied to tasks stemming from sensory processing.
A rich data representation that finds wide application in industry and research is the so-called knowledge graph.
We propose a spike-based algorithm where nodes in a graph are represented by single spike times of neuron populations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite the recent success of reconciling spike-based coding with the error
backpropagation algorithm, spiking neural networks are still mostly applied to
tasks stemming from sensory processing, operating on traditional data
structures like visual or auditory data. A rich data representation that finds
wide application in industry and research is the so-called knowledge graph - a
graph-based structure where entities are depicted as nodes and relations
between them as edges. Complex systems like molecules, social networks and
industrial factory systems can be described using the common language of
knowledge graphs, allowing the usage of graph embedding algorithms to make
context-aware predictions in these information-packed environments. We propose
a spike-based algorithm where nodes in a graph are represented by single spike
times of neuron populations and relations as spike time differences between
populations. Learning such spike-based embeddings only requires knowledge about
spike times and spike time differences, compatible with recently proposed
frameworks for training spiking neural networks. The presented model is easily
mapped to current neuromorphic hardware systems and thereby moves inference on
knowledge graphs into a domain where these architectures thrive, unlocking a
promising industrial application area for this technology.
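The abstract describes nodes as single spike times of neuron populations and relations as spike-time differences between populations. Below is a minimal NumPy sketch of how such a scoring scheme could look, assuming a TransE-like formulation with an L1 distance; the variable names, population size, scoring sign, and loss are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of spike-time embeddings for knowledge graph triples.
# Assumptions (not taken from the paper): entities are vectors of spike times,
# relations are vectors of target spike-time differences, scoring uses a
# negative L1 distance, and training would use a margin-based ranking loss.
import numpy as np

rng = np.random.default_rng(0)

num_entities, num_relations, pop_size = 100, 10, 16
T = 1.0  # length of the spike-time window

# Entity embeddings: spike times of a neuron population within [0, T).
entity_spikes = rng.uniform(0.0, T, size=(num_entities, pop_size))

# Relation embeddings: expected spike-time differences between populations.
relation_diffs = rng.uniform(-T, T, size=(num_relations, pop_size))


def score(subj: int, rel: int, obj: int) -> float:
    """Higher is better: how well the observed spike-time differences between
    the subject and object populations match the relation's learned
    spike-time differences (negative L1 distance)."""
    observed_diff = entity_spikes[subj] - entity_spikes[obj]
    return -np.abs(observed_diff - relation_diffs[rel]).sum()


def margin_ranking_loss(pos, neg, margin: float = 1.0) -> float:
    """Contrast an observed triple (s, p, o) with a corrupted one, as in
    standard graph embedding training."""
    return max(0.0, margin - score(*pos) + score(*neg))


# Example: score a triple and compare it against a corrupted object.
print(score(0, 3, 7), margin_ranking_loss((0, 3, 7), (0, 3, 42)))
```

Since the scores and gradients depend only on spike times and spike-time differences, such a scheme is compatible with spike-based training frameworks and, as the abstract argues, maps naturally onto neuromorphic hardware.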
Related papers
- Everything is Connected: Graph Neural Networks [0.0]
The main aim of this short survey is to enable the reader to assimilate the key concepts in graph representation learning and to position the field in a proper context with related areas.
arXiv Detail & Related papers (2023-01-19T18:09:43Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Neuro-symbolic computing with spiking neural networks [0.6035125735474387]
We extend previous work on spike-based graph algorithms by demonstrating how symbolic and multi-relational information can be encoded using spiking neurons.
The introduced framework is enabled by combining the graph embedding paradigm and the recent progress in training spiking neural networks using error backpropagation.
arXiv Detail & Related papers (2022-08-04T10:49:34Z)
- Relational representation learning with spike trains [0.0]
We present a model that allows us to learn spike train-based embeddings of knowledge graphs, requiring only one neuron per graph element by fully utilizing the temporal domain of spike patterns.
In general, the presented results show how relational knowledge can be integrated into spike-based systems, opening up the possibility of merging event-based computing and data to build powerful and energy efficient artificial intelligence applications and reasoning systems.
arXiv Detail & Related papers (2022-05-18T18:00:37Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still bounded and limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Learning through structure: towards deep neuromorphic knowledge graph embeddings [0.5906031288935515]
We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that randomly initialized and untrained graph neural networks are able to preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models.
We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
arXiv Detail & Related papers (2021-09-21T18:01:04Z)
- Exploiting Heterogeneous Graph Neural Networks with Latent Worker/Task Correlation Information for Label Aggregation in Crowdsourcing [72.34616482076572]
Crowdsourcing has attracted much attention for its convenience to collect labels from non-expert workers instead of experts.
We propose a novel framework based on graph neural networks for aggregating crowd labels.
arXiv Detail & Related papers (2020-10-25T10:12:37Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- A Heterogeneous Graph with Factual, Temporal and Logical Knowledge for Question Answering Over Dynamic Contexts [81.4757750425247]
We study question answering over a dynamic textual environment.
We develop a graph neural network over the constructed graph, and train the model in an end-to-end manner.
arXiv Detail & Related papers (2020-04-25T04:53:54Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.