Convolutional Neural Network Dynamics: A Graph Perspective
- URL: http://arxiv.org/abs/2111.05410v1
- Date: Tue, 9 Nov 2021 20:38:48 GMT
- Title: Convolutional Neural Network Dynamics: A Graph Perspective
- Authors: Fatemeh Vahedian, Ruiyu Li, Puja Trivedi, Di Jin, Danai Koutra
- Abstract summary: We take a graph perspective and investigate the relationship between the graph structure of NNs and their performance.
For the dynamic graph representation of NNs, we explore structural representations for fully-connected and convolutional layers.
Our analysis shows that a simple summary of graph statistics can be used to accurately predict the performance of NNs.
- Score: 39.81881710355496
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The success of neural networks (NNs) in a wide range of applications has led
to increased interest in understanding the underlying learning dynamics of
these models. In this paper, we go beyond mere descriptions of the learning
dynamics by taking a graph perspective and investigating the relationship
between the graph structure of NNs and their performance. Specifically, we
propose (1) representing the neural network learning process as a time-evolving
graph (i.e., a series of static graph snapshots over epochs), (2) capturing the
structural changes of the NN during the training phase in a simple temporal
summary, and (3) leveraging the structural summary to predict the accuracy of
the underlying NN in a classification or regression task. For the dynamic graph
representation of NNs, we explore structural representations for
fully-connected and convolutional layers, which are key components of powerful
NN models. Our analysis shows that a simple summary of graph statistics, such
as weighted degree and eigenvector centrality, over just a few epochs can be
used to accurately predict the performance of NNs. For example, a weighted
degree-based summary of the time-evolving graph that is constructed based on 5
training epochs of the LeNet architecture achieves classification accuracy of
over 93%. Our findings are consistent for different NN architectures, including
LeNet, VGG, AlexNet and ResNet.
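As a rough illustration of this pipeline, the sketch below turns each epoch's weight matrix into a graph snapshot, summarizes every snapshot by its weighted node degrees, and feeds the concatenated per-epoch summaries to an off-the-shelf classifier. All function and variable names are illustrative assumptions, and the graph construction is simplified (the paper also develops representations for convolutional layers):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def snapshot_weighted_degrees(weight_matrix):
    """Treat the absolute weights of a fully-connected layer
    (shape: outputs x inputs) as a weighted bipartite adjacency
    matrix and return every node's weighted degree.
    Simplified sketch; not the authors' exact construction."""
    w = np.abs(weight_matrix)
    in_deg = w.sum(axis=0)    # weighted degree of each input neuron
    out_deg = w.sum(axis=1)   # weighted degree of each output neuron
    return np.concatenate([in_deg, out_deg])

def temporal_summary(weights_per_epoch):
    """Concatenate per-epoch degree vectors into one temporal
    summary, e.g. over the first 5 training epochs."""
    return np.concatenate(
        [snapshot_weighted_degrees(w) for w in weights_per_epoch]
    )

def fit_performance_predictor(runs, labels):
    """Hypothetical inputs: `runs` holds, per trained network, one
    layer's weight matrices over the first few epochs; `labels`
    marks e.g. whether the run reached high or low final accuracy."""
    features = np.stack([temporal_summary(r) for r in runs])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(features, labels)
    return clf
```

The abstract's headline result (over 93% accuracy from a weighted degree-based summary of 5 LeNet training epochs) corresponds, at a high level, to this kind of summary-plus-classifier setup.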
Related papers
- A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z)
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- Graph-Time Convolutional Neural Networks: Architecture and Theoretical Analysis [12.995632804090198]
We introduce Graph-Time Convolutional Neural Networks (GTCNNs) as a principled architecture to aid learning.
The approach can work with any type of product graph, and we also introduce a parametric graph to learn the product-temporal coupling (a generic product-graph construction is sketched after this entry).
Extensive numerical results on benchmark datasets corroborate our findings and show that the GTCNN compares favorably with state-of-the-art solutions.
arXiv Detail & Related papers (2022-06-30T10:20:52Z)
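For intuition on product graphs: the Cartesian product of a spatial graph with N nodes and a temporal path graph over T steps has an adjacency matrix given by a Kronecker sum. A minimal sketch of this generic construction (not the GTCNN paper's code):

```python
import numpy as np

def cartesian_product_adjacency(A_space, A_time):
    """Adjacency of the Cartesian product graph:
    A = kron(A_time, I_N) + kron(I_T, A_space)."""
    N, T = A_space.shape[0], A_time.shape[0]
    return np.kron(A_time, np.eye(N)) + np.kron(np.eye(T), A_space)

# Example: a 4-node cycle graph replicated over 3 time steps.
A_space = np.roll(np.eye(4), 1, axis=1) + np.roll(np.eye(4), -1, axis=1)
A_time = np.diag(np.ones(2), 1) + np.diag(np.ones(2), -1)  # path graph
A_prod = cartesian_product_adjacency(A_space, A_time)      # shape (12, 12)
```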
- Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning [9.88508686848173]
Biological spiking neurons with intrinsic dynamics underlie the powerful representation and learning capabilities of the brain.
Despite tremendous recent progress in spiking neural networks (SNNs) for handling Euclidean-space tasks, it remains challenging to exploit SNNs for processing non-Euclidean-space data.
Here we present a general spike-based modeling framework that enables the direct training of SNNs for graph learning.
arXiv Detail & Related papers (2021-06-30T11:20:16Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations (a minimal aggregation sketch follows this entry).
Several recent studies attribute the performance deterioration of deeper GNNs to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
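To make "neighborhood aggregation" concrete, a single graph-convolution layer in the style of Kipf and Welling can be written in a few lines. This is a generic illustration, not DAGNN itself:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph convolution:
    H = ReLU(D^{-1/2} (A + I) D^{-1/2} X W),
    i.e. normalized neighbor aggregation, then a linear transform."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # D^{-1/2} entries
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)         # ReLU
```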
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework (a generic contrastive-loss sketch follows this entry).
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
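Contrastive pre-training frameworks of this kind typically optimize an InfoNCE-style objective. A minimal sketch of that generic loss (illustrative of contrastive learning in general, not the exact GCC objective):

```python
import numpy as np

def info_nce_loss(queries, keys, temperature=0.07):
    """Generic InfoNCE loss: each query's positive key shares its row
    index; all other keys serve as negatives."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / temperature               # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives on diagonal
```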