Ring Reservoir Neural Networks for Graphs
- URL: http://arxiv.org/abs/2005.05294v1
- Date: Mon, 11 May 2020 17:51:40 GMT
- Title: Ring Reservoir Neural Networks for Graphs
- Authors: Claudio Gallicchio and Alessio Micheli
- Abstract summary: Reservoir Computing models can play an important role in developing fruitful graph embeddings.
Our core proposal is based on shaping the organization of the hidden neurons to follow a ring topology.
Experimental results on graph classification tasks indicate that ring-reservoir architectures enable particularly effective network configurations.
- Score: 15.07984894938396
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine Learning for graphs is nowadays a research topic of consolidated
relevance. Common approaches in the field typically resort to complex deep
neural network architectures and demanding training algorithms, highlighting
the need for more efficient solutions. The class of Reservoir Computing (RC)
models can play an important role in this context, enabling the development of
fruitful graph embeddings through untrained recursive architectures. In this
paper, we
study progressive simplifications to the design strategy of RC neural networks
for graphs. Our core proposal is based on shaping the organization of the
hidden neurons to follow a ring topology. Experimental results on graph
classification tasks indicate that ring-reservoir architectures enable
particularly effective network configurations, showing consistent advantages in
terms of predictive performance.
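To make the core proposal concrete, the following is a minimal NumPy sketch of an untrained, ring-topology reservoir embedding a single graph. This is a sketch under assumptions, not the authors' exact model: the update rule, the sum pooling, and the values of `w_ring`, `w_in`, and `iters` are illustrative.

```python
import numpy as np

def ring_reservoir_embedding(adj, X, hidden=64, iters=10,
                             w_ring=0.9, w_in=0.5, seed=0):
    """Untrained graph embedding from a ring-topology reservoir (sketch).

    adj : (n, n) adjacency matrix of the input graph
    X   : (n, d) node feature matrix
    NOTE: update rule and hyperparameters are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Ring topology: every hidden unit feeds only its successor with one
    # shared weight, so the recurrent matrix is a scaled cyclic shift.
    W = w_ring * np.roll(np.eye(hidden), shift=1, axis=1)

    # Untrained input weights; random +/- signs are a common RC choice.
    W_in = w_in * rng.choice([-1.0, 1.0], size=(d, hidden))

    # Recursive state computation: each node's state is driven by its own
    # features and by the reservoir states of its neighbours.
    H = np.zeros((n, hidden))
    for _ in range(iters):
        H = np.tanh(X @ W_in + adj @ H @ W)

    # Graph-level embedding by sum pooling over node states.
    return H.sum(axis=0)
```

In a Reservoir Computing pipeline only a linear readout (e.g. ridge regression from these fixed embeddings to class labels) is trained, which is where the efficiency advantage over fully trained deep architectures comes from.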
Related papers
- Enhancing Graph Representation Learning with Attention-Driven Spiking Neural Networks [5.627287101959473]
Spiking neural networks (SNNs) have emerged as a promising alternative to traditional neural networks for graph learning tasks.
We propose a novel approach that integrates attention mechanisms with SNNs to improve graph representation learning.
arXiv Detail & Related papers (2024-03-25T12:15:10Z)
- Spiking Graph Convolutional Networks [19.36064180392385]
SpikingGCN is an end-to-end framework that aims to integrate the embedding of GCNs with the biofidelity characteristics of SNNs.
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- Representation Learning of Reconstructed Graphs Using Random Walk Graph Convolutional Network [12.008472517000651]
We propose wGCN -- a novel framework that utilizes random walk to obtain the node-specific mesoscopic structures of the graph.
We believe that incorporating high-order local structural information can more effectively exploit the potential of the network.
arXiv Detail & Related papers (2021-01-02T10:31:14Z)
- Analyzing the Performance of Graph Neural Networks with Pipe Parallelism [2.269587850533721]
We focus on Graph Neural Networks (GNNs) that have found great success in tasks such as node or edge classification and link prediction.
New approaches for processing larger networks are needed to advance graph techniques.
We study how GNNs could be parallelized using existing tools and frameworks that are known to be successful in the deep learning community.
arXiv Detail & Related papers (2020-12-20T04:20:38Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
However, performance often deteriorates when many such layers are stacked; several recent studies attribute this deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator (a minimal sketch of this block follows the list below).
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
- Progressive Graph Convolutional Networks for Semi-Supervised Node Classification [97.14064057840089]
Graph convolutional networks have been successful in addressing graph-based tasks such as semi-supervised node classification.
We propose a method to automatically build compact and task-specific graph convolutional networks.
arXiv Detail & Related papers (2020-03-27T08:32:16Z)
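As referenced in the affine-skip-connection entry above, here is a minimal NumPy sketch of that building block: a fully connected (affine) path added to the output of a graph convolution. The mean-aggregation convolution used here is only a stand-in for "any graph convolution operator", and all weight shapes are illustrative assumptions.

```python
import numpy as np

def affine_skip_block(X, adj, W_conv, W_skip, b_skip):
    """Graph convolution plus affine skip connection (sketch).

    X   : (n, d_in) node features
    adj : (n, n) adjacency matrix
    NOTE: the mean-aggregation convolution is a stand-in for any graph
    convolution operator; weights and shapes are assumptions.
    """
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
    conv = ((adj @ X) / deg) @ W_conv    # simple mean-aggregation conv
    skip = X @ W_skip + b_skip           # affine (fully connected) path
    return np.maximum(conv + skip, 0.0)  # nonlinearity on the sum
```

One practical consequence of the affine path is that, unlike a plain identity residual, it can change the feature dimensionality, so the block composes with convolutions of any output width.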
This list is automatically generated from the titles and abstracts of the papers on this site.