Bosonic Random Walk Networks for Graph Learning
- URL: http://arxiv.org/abs/2101.00082v1
- Date: Thu, 31 Dec 2020 21:40:40 GMT
- Title: Bosonic Random Walk Networks for Graph Learning
- Authors: Shiv Shankar, Don Towsley
- Abstract summary: We explore applications of multi-particle quantum walks on diffusing information across graphs.
Our model is based on learning the operators that govern the dynamics of quantum random walkers on graphs.
We demonstrate the effectiveness of our method on classification and regression tasks.
- Score: 32.24009574184356
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The development of Graph Neural Networks (GNNs) has led to great progress in
machine learning on graph-structured data. These networks operate via diffusing
information across the graph nodes while capturing the structure of the graph.
Recently there has also been tremendous progress in quantum computing
techniques. In this work, we explore applications of multi-particle quantum
walks on diffusing information across graphs. Our model is based on learning
the operators that govern the dynamics of quantum random walkers on graphs. We
demonstrate the effectiveness of our method on classification and regression
tasks.
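To make the dynamics concrete, a minimal continuous-time quantum walk on a graph evolves a single walker under the unitary U(t) = exp(-iAt), where A is the adjacency matrix, and node probabilities follow the Born rule. This is a generic single-walker sketch for illustration only, not the paper's learned multi-particle (bosonic) model; the function name and parameters are assumptions.

```python
import numpy as np

def quantum_walk_probs(adj, start, t=1.0):
    """Node probabilities of a continuous-time quantum walker after
    time t, starting at node `start`. Uses U(t) = exp(-i*A*t), computed
    via eigendecomposition of the symmetric adjacency matrix."""
    w, v = np.linalg.eigh(adj)                       # A = V diag(w) V^T
    U = v @ np.diag(np.exp(-1j * w * t)) @ v.conj().T
    psi0 = np.zeros(adj.shape[0], dtype=complex)
    psi0[start] = 1.0                                # walker localized at `start`
    psi = U @ psi0                                   # evolve the amplitude vector
    return np.abs(psi) ** 2                          # Born rule: |amplitude|^2

# Example: a 4-cycle graph
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
p = quantum_walk_probs(A, start=0, t=1.0)
```

By symmetry of the 4-cycle, the two neighbours of the start node receive equal probability, and the distribution sums to one.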
Related papers
- Graphs Unveiled: Graph Neural Networks and Graph Generation [0.0]
This paper provides a comprehensive overview of Graph Neural Networks (GNNs).
We discuss the applications of graph neural networks across various domains.
We present an advanced field in GNNs: graph generation.
arXiv Detail & Related papers (2024-03-18T14:37:27Z)
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
arXiv Detail & Related papers (2022-10-27T16:00:45Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
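The unrolling idea can be sketched with a plain truncated ISTA loop, where each proximal-gradient iteration plays the role of one network layer. This is a generic sparse-deconvolution sketch with fixed (unlearned) step size and threshold, not the GDN architecture itself; in a learned unrolling, `step` and `lam` would be trainable per layer.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the L1 norm (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def unrolled_ista(H, y, n_layers=5, step=0.1, lam=0.05):
    """Truncated proximal-gradient (ISTA) iterations unrolled into a
    fixed-depth 'network', approximately minimizing
    0.5 * ||H x - y||^2 + lam * ||x||_1.
    Here step size and threshold are fixed constants (an illustrative
    assumption); a learned unrolling would parameterize each layer."""
    x = np.zeros(H.shape[1])
    for _ in range(n_layers):                        # each iteration = one "layer"
        grad = H.T @ (H @ x - y)                     # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x

x_hat = unrolled_ista(np.eye(3), np.array([10.0, 0.0, -10.0]))
```

With an identity forward operator, the recovered signal keeps the sign pattern of the observation while shrinking small entries to zero.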
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- A generative neural network model for random dot product graphs [1.1421942894219896]
GraphMoE is a novel approach to learning generative models for random graphs.
The neural network is trained to match the distribution of a class of random graphs by way of a moment estimator.
arXiv Detail & Related papers (2022-04-15T19:59:22Z)
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still bounded and limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
- Learning through structure: towards deep neuromorphic knowledge graph embeddings [0.5906031288935515]
We propose a strategy to map deep graph learning architectures for knowledge graph reasoning to neuromorphic architectures.
Based on the insight that randomly initialized and untrained graph neural networks are able to preserve local graph structures, we compose a frozen neural network with shallow knowledge graph embedding models.
We experimentally show that already on conventional computing hardware, this leads to a significant speedup and memory reduction while maintaining a competitive performance level.
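The underlying insight, that an untrained message-passing network with fixed random weights already preserves local graph structure, can be sketched as follows. The mean-aggregation rule, depth, and dimensions are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def random_gnn_embed(adj, feats, depth=2, dim=16, seed=0):
    """Frozen (untrained) message-passing embedding: each layer averages
    neighbour features, applies a fixed random linear map, and a tanh
    nonlinearity. No training is involved; the random weights are drawn
    once and kept fixed."""
    rng = np.random.default_rng(seed)
    deg = adj.sum(1, keepdims=True).clip(min=1)      # degrees (avoid divide-by-zero)
    h = feats
    for _ in range(depth):
        W = rng.standard_normal((h.shape[1], dim)) / np.sqrt(h.shape[1])
        h = np.tanh((adj @ h / deg) @ W)             # aggregate neighbours, project
    return h                                         # fixed node embeddings

# Example: 4-cycle graph with one-hot node features
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
emb = random_gnn_embed(A, np.eye(4))
```

Such frozen embeddings could then be scored by a shallow, trainable model, which is where the speed and memory savings reported above would come from.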
arXiv Detail & Related papers (2021-09-21T18:01:04Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
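Bernoulli sampling of a finite graph from a graphon can be sketched in a few lines: draw latent positions uniformly on [0, 1], then draw each edge independently with probability W(u_i, u_j). The example graphon W(x, y) = xy is an arbitrary illustrative choice, not one used by the paper.

```python
import numpy as np

def sample_graph_from_graphon(W, n, seed=None):
    """Sample an n-node simple graph from a graphon W: [0,1]^2 -> [0,1].
    Latent positions u_i ~ Uniform(0,1); edge (i,j) appears independently
    with probability W(u_i, u_j) (Bernoulli sampling)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)                          # latent node positions
    probs = W(u[:, None], u[None, :])                # pairwise edge probabilities
    upper = rng.random((n, n)) < probs               # Bernoulli draws
    adj = np.triu(upper, 1)                          # keep strict upper triangle
    return (adj | adj.T).astype(int)                 # symmetrize; zero diagonal

# Example graphon: W(x, y) = x * y
G = sample_graph_from_graphon(lambda x, y: x * y, n=64, seed=0)
```

Growing-graph training schemes of the kind described above would repeat this sampling at successively larger n.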
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Quantum machine learning of graph-structured data [0.38581147665516596]
We consider graph-structured quantum data and describe how to carry out its quantum machine learning via quantum neural networks.
We explain how to systematically exploit this additional graph structure to improve quantum learning algorithms.
arXiv Detail & Related papers (2021-03-19T14:39:19Z)
- Learning Graph Representations [0.0]
Graph Neural Networks (GNNs) are efficient ways to get insight into large dynamic graph datasets.
In this paper, we discuss graph convolutional neural networks, graph autoencoders, and social-temporal graph neural networks.
arXiv Detail & Related papers (2021-02-03T12:07:55Z)
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.