Machine learning of percolation models using graph convolutional neural
networks
- URL: http://arxiv.org/abs/2207.03368v2
- Date: Fri, 7 Apr 2023 13:19:43 GMT
- Title: Machine learning of percolation models using graph convolutional neural
networks
- Authors: Hua Tian, Lirong Zhang, Youjin Deng, and Wanzhou Zhang
- Abstract summary: Prediction of percolation thresholds with machine learning methods remains challenging.
We build a powerful graph convolutional neural network to study percolation in both supervised and unsupervised ways.
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Percolation is an important topic in climate, physics, materials science,
epidemiology, finance, and so on. Prediction of percolation thresholds with
machine learning methods remains challenging. In this paper, we build a
powerful graph convolutional neural network to study percolation in both
supervised and unsupervised ways. From the supervised learning perspective, the
graph convolutional neural network trains simultaneously and correctly on data from
different lattice types, such as the square and triangular lattices. From the
unsupervised perspective, combining the graph convolutional neural network with
the confusion method, the percolation threshold can be obtained from the
"W"-shaped performance curve. The findings of this work open up the possibility of
building a more general framework that can probe percolation-related
phenomena.
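The supervised setup described in the abstract needs labelled lattice configurations: samples tagged by whether they percolate at a given occupation probability. A minimal sketch of how such labels could be generated with a classical Monte Carlo spanning test on a square lattice (pure Python; function names are illustrative, and this is not the paper's GCN code):

```python
import random

def spans(L, p, rng):
    """Return True if occupied sites form a top-to-bottom spanning cluster
    on an L x L square lattice with site occupation probability p."""
    occupied = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    # Flood-fill downward from every occupied site in the top row.
    stack = [(0, c) for c in range(L) if occupied[0][c]]
    seen = set(stack)
    while stack:
        r, c = stack.pop()
        if r == L - 1:          # reached the bottom row: spanning cluster found
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < L and 0 <= nc < L and occupied[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False

def spanning_probability(L, p, trials=200, seed=0):
    """Monte Carlo estimate of the probability that a random configuration spans."""
    rng = random.Random(seed)
    return sum(spans(L, p, rng) for _ in range(trials)) / trials
```

Sweeping p across the known site-percolation threshold of the square lattice (p_c ~ 0.5927) and recording `spans(...)` as the label yields exactly the kind of binary training data a supervised classifier, or the confusion-method scan, would consume.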
Related papers
- A Survey on Graph Classification and Link Prediction based on GNN [11.614366568937761]
This review article delves into the world of graph convolutional neural networks.
It elaborates on the fundamentals of graph convolutional neural networks.
It elucidates the graph neural network models based on attention mechanisms and autoencoders.
arXiv Detail & Related papers (2023-07-03T09:08:01Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Learning on Arbitrary Graph Topologies via Predictive Coding [38.761663028090204]
We show how predictive coding can be used to perform inference and learning on arbitrary graph topologies.
We experimentally show how this formulation, called PC graphs, can be used to flexibly perform different tasks with the same network.
arXiv Detail & Related papers (2022-01-31T12:43:22Z)
- Self-Supervised Graph Representation Learning for Neuronal Morphologies [75.38832711445421]
We present GraphDINO, a data-driven approach to learn low-dimensional representations of 3D neuronal morphologies from unlabeled datasets.
We show, in two different species and across multiple brain areas, that this method yields morphological cell type clusterings on par with manual feature-based classification by experts.
Our method could potentially enable data-driven discovery of novel morphological features and cell types in large-scale datasets.
arXiv Detail & Related papers (2021-12-23T12:17:47Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- A Unifying Generative Model for Graph Learning Algorithms: Label Propagation, Graph Convolutions, and Combinations [39.8498896531672]
Semi-supervised learning on graphs is a widely applicable problem in network science and machine learning.
We develop a Markov random field model for the data generation process of node attributes.
We show that label propagation, a linearized graph convolutional network, and their combination can all be derived as conditional expectations.
arXiv Detail & Related papers (2021-01-19T17:07:08Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deep graph neural networks to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Convolutional Kernel Networks for Graph-Structured Data [37.13712126432493]
We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods.
Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps.
Our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks.
arXiv Detail & Related papers (2020-03-11T09:44:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.