Analyzing Neural Networks Based on Random Graphs
- URL: http://arxiv.org/abs/2002.08104v3
- Date: Wed, 2 Dec 2020 11:29:36 GMT
- Title: Analyzing Neural Networks Based on Random Graphs
- Authors: Romuald A. Janik and Aleksandra Nowak
- Abstract summary: We perform a massive evaluation of neural networks with architectures corresponding to random graphs of various types.
We find that none of the classical numerical graph invariants by itself allows us to single out the best networks.
We also find that networks with primarily short-range connections perform better than networks which allow for many long-range connections.
- Score: 77.34726150561087
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We perform a massive evaluation of neural networks with architectures
corresponding to random graphs of various types. We investigate various
structural and numerical properties of the graphs in relation to neural network
test accuracy. We find that none of the classical numerical graph invariants by
itself allows us to single out the best networks. Consequently, we introduce a new
numerical graph characteristic that selects a set of quasi-1-dimensional
graphs, which are a majority among the best performing networks. We also find
that networks with primarily short-range connections perform better than
networks which allow for many long-range connections. Moreover, many
resolution-reducing pathways are beneficial. We provide a dataset of 1020 graphs and the
test accuracies of their corresponding neural networks at
https://github.com/rmldj/random-graph-nn-paper
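
As a rough illustration of the setup the abstract describes, the sketch below samples random graphs of a few classical types and computes classical numerical invariants of the kind the paper examines. This is not the authors' code; the graph families, sizes, and parameter values are illustrative assumptions, and it uses networkx.

```python
# A minimal sketch (not the paper's pipeline): sample random graphs of
# several types and record classical numerical invariants that one might
# correlate with the test accuracy of the corresponding neural network.
import networkx as nx

def sample_graphs(n_nodes=32, seed=0):
    """Return a few random graphs of different types, keyed by family name."""
    return {
        "erdos_renyi": nx.erdos_renyi_graph(n_nodes, p=0.2, seed=seed),
        "watts_strogatz": nx.watts_strogatz_graph(n_nodes, k=4, p=0.25, seed=seed),
        "barabasi_albert": nx.barabasi_albert_graph(n_nodes, m=2, seed=seed),
    }

def classical_invariants(g):
    """Classical invariants of the kind the paper finds insufficient, taken
    in isolation, to single out the best-performing networks."""
    invariants = {
        "avg_clustering": nx.average_clustering(g),
        "avg_degree": sum(d for _, d in g.degree()) / g.number_of_nodes(),
    }
    # Average shortest path length is only defined on connected graphs.
    if nx.is_connected(g):
        invariants["avg_path_length"] = nx.average_shortest_path_length(g)
    return invariants

for name, g in sample_graphs().items():
    print(name, classical_invariants(g))
```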
Related papers
- Sum-Product-Set Networks: Deep Tractable Models for Tree-Structured Graphs [0.0]
We propose sum-product-set networks, an extension of probabilistic circuits from unstructured data to tree-structured graph data.
We demonstrate that our tractable model performs comparably to various intractable models based on neural networks.
arXiv Detail & Related papers (2024-08-14T09:13:27Z) - Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - Relation Embedding based Graph Neural Networks for Handling
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that enables homogeneous GNNs to handle heterogeneous graphs adequately.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
- Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN).
We show that the proposed model outperforms other state-of-the-art GNN models and achieves up to a 64% improvement in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z) - Natural Graph Networks [80.77570956520482]
We show that the more general concept of naturality is sufficient for a graph network to be well-defined.
We define global and local natural graph networks, the latter of which are as scalable as conventional message passing graph neural networks.
arXiv Detail & Related papers (2020-07-16T14:19:06Z) - Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z) - How hard is to distinguish graphs with graph neural networks? [32.09819774228997]
This study derives hardness results for the classification variant of graph isomorphism in the message-passing model (MPNN).
MPNN encompasses the majority of graph neural networks used today and is universal when nodes are given unique features.
An empirical study involving 12 graph classification tasks and 420 networks reveals strong alignment between actual performance and theoretical predictions.
arXiv Detail & Related papers (2020-05-13T22:28:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.