Inter-layer Information Similarity Assessment of Deep Neural Networks Via Topological Similarity and Persistence Analysis of Data Neighbour Dynamics
- URL: http://arxiv.org/abs/2012.03793v1
- Date: Mon, 7 Dec 2020 15:34:58 GMT
- Title: Inter-layer Information Similarity Assessment of Deep Neural Networks Via Topological Similarity and Persistence Analysis of Data Neighbour Dynamics
- Authors: Andrew Hryniowski and Alexander Wong
- Abstract summary: The quantitative analysis of information structure through a deep neural network (DNN) can unveil new insights into the theoretical performance of DNN architectures.
Inspired by both LS and ID strategies for quantitative information structure analysis, we introduce two novel complementary methods for inter-layer information similarity assessment.
We demonstrate their efficacy in this study by performing analysis on a deep convolutional neural network architecture on image data.
- Score: 93.4221402881609
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The quantitative analysis of information structure through a deep neural
network (DNN) can unveil new insights into the theoretical performance of DNN
architectures. Two very promising avenues of research towards quantitative
information structure analysis are: 1) layer similarity (LS) strategies focused
on the inter-layer feature similarity, and 2) intrinsic dimensionality (ID)
strategies focused on layer-wise data dimensionality using pairwise
information. Inspired by both LS and ID strategies for quantitative information
structure analysis, we introduce two novel complementary methods for
inter-layer information similarity assessment premised on the interesting idea
of studying a data sample's neighbourhood dynamics as it traverses through a
DNN. More specifically, we introduce the concept of Nearest Neighbour
Topological Similarity (NNTS) for quantifying the information topology
similarity between layers of a DNN. Furthermore, we introduce the concept of
Nearest Neighbour Topological Persistence (NNTP) for quantifying the
inter-layer persistence of data neighbourhood relationships throughout a DNN.
The proposed strategies facilitate efficient inter-layer information
similarity assessment by leveraging only local topological information, and we
demonstrate their efficacy in this study by performing analysis on a deep
convolutional neural network architecture on image data to study the insights
that can be gained with respect to the theoretical performance of a DNN.
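The paper itself provides no reference implementation, but the core ideas can be sketched from the abstract: NNTS compares a sample's k-nearest-neighbour set in two layers' activation spaces, and NNTP tracks how long a sample's neighbourhood survives as it traverses the network. The sketch below is an illustrative assumption, not the authors' exact formulation — the function names, the Euclidean metric, and the use of Jaccard overlap as the similarity score are choices made here for demonstration only.

```python
import numpy as np

def knn_indices(features, k):
    """Indices of the k nearest neighbours (Euclidean) of each row, excluding self."""
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # a sample is never its own neighbour
    return np.argsort(d, axis=1)[:, :k]

def nnts(layer_a, layer_b, k=5):
    """NNTS-style score (assumed form): mean Jaccard overlap of each
    sample's k-NN sets computed in two layers' activation spaces."""
    na, nb = knn_indices(layer_a, k), knn_indices(layer_b, k)
    scores = []
    for i in range(layer_a.shape[0]):
        sa, sb = set(na[i]), set(nb[i])
        scores.append(len(sa & sb) / len(sa | sb))
    return float(np.mean(scores))

def nntp(layers, k=5):
    """NNTP-style trace (assumed form): for each later layer, the mean
    fraction of a sample's first-layer neighbours still among its k
    nearest neighbours, i.e. how neighbourhoods persist through the DNN."""
    base = knn_indices(layers[0], k)
    out = []
    for feats in layers[1:]:
        cur = knn_indices(feats, k)
        retained = [len(set(base[i]) & set(cur[i])) / k for i in range(len(base))]
        out.append(float(np.mean(retained)))
    return out
```

Both scores lie in [0, 1], use only local topological information (each sample's neighbour set), and require no training or global dimensionality estimation, which is the efficiency argument made in the abstract.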
Related papers
- Deep Neural Networks via Complex Network Theory: a Perspective [3.1023851130450684]
Deep Neural Networks (DNNs) can be represented as graphs whose links and vertices iteratively process data and solve tasks sub-optimally. Complex Network Theory (CNT), merging statistical physics with graph theory, provides a method for interpreting neural networks by analysing their weights and neuron structures.
In this work, we extend the existing CNT metrics with measures that sample from the DNNs' training distribution, shifting from a purely topological analysis to one that connects with the interpretability of deep learning.
arXiv Detail & Related papers (2024-04-17T08:42:42Z)
- Topological Data Analysis for Neural Network Analysis: A Comprehensive Survey [35.29334376503123]
This survey provides a comprehensive exploration of applications of Topological Data Analysis (TDA) within neural network analysis.
We discuss different strategies to obtain topological information from data and neural networks by means of TDA.
We explore practical implications of deep learning, specifically focusing on areas like adversarial detection and model selection.
arXiv Detail & Related papers (2023-12-10T09:50:57Z)
- Deep Learning-based Analysis of Basins of Attraction [49.812879456944984]
This research addresses the challenge of characterizing the complexity and unpredictability of basins within various dynamical systems.
The main focus is on demonstrating the efficiency of convolutional neural networks (CNNs) in this field.
arXiv Detail & Related papers (2023-09-27T15:41:12Z)
- Deep neural networks architectures from the perspective of manifold learning [0.0]
This paper is a comprehensive comparison and description of neural network architectures in terms of geometry and topology.
We focus on the internal representation of neural networks and on the dynamics of changes in the topology and geometry of a data manifold on different layers.
arXiv Detail & Related papers (2023-06-06T04:57:39Z)
- Experimental Observations of the Topology of Convolutional Neural Network Activations [2.4235626091331737]
Topological data analysis provides compact, noise-robust representations of complex structures.
Deep neural networks (DNNs) learn millions of parameters associated with a series of transformations defined by the model architecture.
In this paper, we apply cutting edge techniques from TDA with the goal of gaining insight into the interpretability of convolutional neural networks used for image classification.
arXiv Detail & Related papers (2022-12-01T02:05:44Z)
- Knowledge Enhanced Neural Networks for relational domains [83.9217787335878]
We focus on a specific method, KENN, a Neural-Symbolic architecture that injects prior logical knowledge into a neural network.
In this paper, we propose an extension of KENN for relational data.
arXiv Detail & Related papers (2022-05-31T13:00:34Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Activation Landscapes as a Topological Summary of Neural Network Performance [0.0]
We study how data transforms as it passes through successive layers of a deep neural network (DNN).
We compute the persistent homology of the activation data for each layer of the network and summarize this information using persistence landscapes.
The resulting feature map provides both an informative visualization of the network and a kernel for statistical analysis and machine learning.
arXiv Detail & Related papers (2021-10-19T17:45:36Z)
- Fusing the Old with the New: Learning Relative Camera Pose with Geometry-Guided Uncertainty [91.0564497403256]
We present a novel framework that involves probabilistic fusion between the two families of predictions during network training.
Our network features a self-attention graph neural network, which drives the learning by enforcing strong interactions between different correspondences.
We propose motion parameterizations suitable for learning and show that our method achieves state-of-the-art performance on the challenging DeMoN and ScanNet datasets.
arXiv Detail & Related papers (2021-04-16T17:59:06Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)