Subspace Clustering Based Analysis of Neural Networks
- URL: http://arxiv.org/abs/2107.01296v1
- Date: Fri, 2 Jul 2021 22:46:40 GMT
- Title: Subspace Clustering Based Analysis of Neural Networks
- Authors: Uday Singh Saini, Pravallika Devineni, Evangelos E. Papalexakis
- Abstract summary: We learn affinity graphs from the latent structure of a given neural network layer trained over a set of inputs.
We then use tools from Community Detection to quantify structures present in the input.
We analyze the learned affinity graphs of the final convolutional layer of the network and demonstrate how an input's local neighbourhood affects its classification by the network.
- Score: 7.451579925406617
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tools to analyze the latent space of deep neural networks provide a step
towards better understanding them. In this work, we motivate sparse subspace
clustering (SSC) with the aim of learning affinity graphs from the latent structure
of a given neural network layer trained over a set of inputs. We then use tools
from Community Detection to quantify structures present in the input. These
experiments reveal that as we go deeper in a network, inputs tend to have an
increasing affinity to other inputs of the same class. Subsequently, we utilise
matrix similarity measures to perform layer-wise comparisons between affinity
graphs. In doing so, we first demonstrate that, when comparing a layer during
training to its final state, shallower layers converge more quickly than deeper
ones. When performing a pairwise analysis of the entire network architecture,
we observe that, as the network increases in size, it reorganises from a state
where each layer is moderately similar to its neighbours to a state where
layers within a block are more similar to one another than to layers in other
blocks. Finally, we analyze the
learned affinity graphs of the final convolutional layer of the network and
demonstrate how an input's local neighbourhood affects its classification by
the network.
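The sketch below illustrates the pipeline described in the abstract: learning an SSC affinity graph from one layer's activations, detecting communities in it, and comparing affinity graphs across layers. The helper names, the Lasso-based SSC solver, the modularity-based community detection, and the normalised Frobenius similarity are plausible stand-ins, not necessarily the authors' exact choices.

```python
# Minimal sketch of the pipeline: SSC affinity graph -> community detection
# -> layer-wise graph comparison. Helper names and solver choices are
# illustrative assumptions, not the authors' implementation.
import numpy as np
import networkx as nx
from sklearn.linear_model import Lasso
from networkx.algorithms.community import greedy_modularity_communities

def ssc_affinity(Z, alpha=0.01):
    """Express each latent vector z_i as a sparse combination of the others
    (min ||c||_1 s.t. z_i ~ Z_{-i}^T c), then symmetrise: W = |C| + |C|^T."""
    n = Z.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        mask = np.arange(n) != i                    # exclude z_i itself
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(Z[mask].T, Z[i])                  # columns are the other z_j
        C[i, mask] = lasso.coef_
    return np.abs(C) + np.abs(C).T

def communities(W):
    """Quantify structure among the inputs via modularity-based communities."""
    G = nx.from_numpy_array(W)
    return list(greedy_modularity_communities(G, weight="weight"))

def graph_similarity(W_a, W_b):
    """Normalised Frobenius inner product, one plausible 'matrix similarity
    measure' for comparing the affinity graphs of two layers."""
    return float(np.sum(W_a * W_b) / (np.linalg.norm(W_a) * np.linalg.norm(W_b)))

# Usage: each Z stands in for one layer's activations on a batch (n x d).
rng = np.random.default_rng(0)
W_shallow = ssc_affinity(rng.normal(size=(40, 16)))
W_deep = ssc_affinity(rng.normal(size=(40, 16)))
print(len(communities(W_shallow)), graph_similarity(W_shallow, W_deep))
```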
Related papers
- Neural Collapse in the Intermediate Hidden Layers of Classification Neural Networks [0.0]
Neural Collapse (NC) gives a precise description of the representations of classes in the final hidden layer of classification neural networks.
In the present paper, we provide the first comprehensive empirical analysis of the emergence of NC in the intermediate hidden layers.
arXiv Detail & Related papers (2023-08-05T01:19:38Z)
- Rank Diminishing in Deep Neural Networks [71.03777954670323]
The rank of a neural network measures the information flowing across its layers.
It is an instance of a key structural condition that applies across broad domains of machine learning.
For neural networks, however, the intrinsic mechanism that yields low-rank structures remains unclear.
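A small, hypothetical sketch of how this rank could be observed, assuming activations are collected as one (inputs x width) matrix per layer; the tolerance is an assumption, not the paper's protocol.

```python
# Hypothetical sketch: numerical rank of per-layer activation matrices,
# one way to watch rank diminish with depth.
import numpy as np

def numerical_rank(A, tol=1e-6):
    """Count singular values above tol times the largest singular value."""
    s = np.linalg.svd(A, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# activations: list of (n_inputs x layer_width) matrices, one per layer.
# ranks = [numerical_rank(A) for A in activations]  # typically non-increasing
```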
arXiv Detail & Related papers (2022-06-13T12:03:32Z)
- Similarity and Matching of Neural Network Representations [0.0]
We employ a toolset -- dubbed Dr. Frankenstein -- to analyse the similarity of representations in deep neural networks.
We aim to match the activations on given layers of two trained neural networks by joining them with a stitching layer.
arXiv Detail & Related papers (2021-10-27T17:59:46Z)
- Dive into Layers: Neural Network Capacity Bounding using Algebraic Geometry [55.57953219617467]
We show that the learnability of a neural network is directly related to its size.
We use Betti numbers to measure the topological geometric complexity of input data and the neural network.
We perform experiments on the real-world dataset MNIST, and the results verify our analysis and conclusions.
arXiv Detail & Related papers (2021-09-03T11:45:51Z)
- Hierarchical Graph Neural Networks [0.0]
This paper aims to connect the dots between the traditional Neural Network and the Graph Neural Network architectures.
A Hierarchical Graph Neural Network architecture is proposed, supplementing the original input network layer with the hierarchy of auxiliary network layers.
It enables simultaneous learning of individual node features and aggregated network features at variable resolution, and uses the latter to improve the convergence and stability of node feature learning.
arXiv Detail & Related papers (2021-05-07T16:47:18Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of routing every input through the same fixed path, DG-Net aggregates features dynamically at each node, giving the network greater representational capacity.
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
- On the use of local structural properties for improving the efficiency of hierarchical community detection methods [77.34726150561087]
We study how local structural network properties can be used as proxies to improve the efficiency of hierarchical community detection.
We also check the performance impact of network prunings as an ancillary tactic to make hierarchical community detection more efficient.
arXiv Detail & Related papers (2020-09-15T00:16:12Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges, which reflect the magnitude of the connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Locality Guided Neural Networks for Explainable Artificial Intelligence [12.435539489388708]
We propose a novel back-propagation algorithm, called Locality Guided Neural Network (LGNN).
LGNN preserves locality between neighbouring neurons within each layer of a deep network.
In our experiments, we train various VGG and Wide ResNet (WRN) networks for image classification on CIFAR100.
arXiv Detail & Related papers (2020-07-12T23:45:51Z)
- A Multiscale Graph Convolutional Network Using Hierarchical Clustering [0.0]
A novel architecture is explored which exploits hierarchical clustering information through a multiscale decomposition.
A dendrogram is produced by a Girvan-Newman hierarchical clustering algorithm.
The architecture is tested on a benchmark citation network, demonstrating competitive performance.
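For context, the snippet below shows one way successive Girvan-Newman partitions can be collected into dendrogram levels with networkx; the toy graph and the number of levels are placeholders, not the paper's benchmark setup.

```python
# Hypothetical sketch: successive Girvan-Newman partitions form the
# dendrogram levels that a multiscale decomposition could consume.
import itertools
import networkx as nx
from networkx.algorithms.community import girvan_newman

G = nx.karate_club_graph()          # toy stand-in for a citation network
levels = girvan_newman(G)           # iterator: coarse -> fine partitions
for depth, partition in enumerate(itertools.islice(levels, 3)):
    print(f"level {depth}: {len(partition)} communities")
```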
arXiv Detail & Related papers (2020-06-22T18:13:03Z)
- Fitting the Search Space of Weight-sharing NAS with Graph Convolutional Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)