Topological Uncertainty: Monitoring trained neural networks through
persistence of activation graphs
- URL: http://arxiv.org/abs/2105.04404v1
- Date: Fri, 7 May 2021 14:16:03 GMT
- Title: Topological Uncertainty: Monitoring trained neural networks through
persistence of activation graphs
- Authors: Théo Lacombe (DATASHAPE), Yuichi Ike, Mathieu Carriere, Frédéric
Chazal, Marc Glisse, Yuhei Umeda
- Abstract summary: In industrial applications, data coming from an open-world setting might widely differ from the benchmark datasets on which a network was trained.
We develop a method to monitor trained neural networks based on the topological properties of their activation graphs.
- Score: 0.9786690381850356
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although neural networks can reach astonishing performance in a
wide variety of contexts, properly training them on complicated tasks requires
expertise and can be computationally expensive. In
industrial applications, data coming from an open-world setting might widely
differ from the benchmark datasets on which a network was trained. Being able
to monitor the presence of such variations without retraining the network is of
crucial importance. In this article, we develop a method to monitor trained
neural networks based on the topological properties of their activation graphs.
To each new observation, we assign a Topological Uncertainty, a score that aims
to assess the reliability of the predictions by investigating the whole network
instead of its final layer only, as practitioners typically do. Our approach
works entirely at the post-training level and requires no assumption on the
network architecture or optimization scheme, nor the use of data augmentation
or auxiliary datasets; it can be faithfully applied to a large range of network
architectures and data types. We experimentally showcase the potential of
Topological Uncertainty for trained network selection, Out-Of-Distribution
detection, and shift detection, on both synthetic and real datasets of images
and graphs.
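
To make the pipeline concrete, here is a minimal NumPy sketch of the idea under strong simplifying assumptions: dense layers only, edge weights |W[j, i] * x[i]| on the bipartite activation graph of each layer, a maximum-spanning-tree shortcut for 0-dimensional persistence, and a padded-L2 stand-in for a distance between persistence diagrams. All function names and the choice of distance are ours, not the authors'; read it as an illustration of the technique, not as the reference implementation.

    import numpy as np

    class _UnionFind:
        """Minimal union-find to track merging components during Kruskal."""
        def __init__(self, n):
            self.parent = list(range(n))

        def find(self, a):
            while self.parent[a] != a:
                self.parent[a] = self.parent[self.parent[a]]  # path halving
                a = self.parent[a]
            return a

        def union(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra == rb:
                return False
            self.parent[ra] = rb
            return True

    def layer_diagram(x, W):
        """0-dim persistence of one activation graph via its maximum spanning tree.

        Input unit i and output unit j are linked with weight |W[j, i] * x[i]|.
        Under a superlevel (decreasing-weight) edge filtration, the finite 0-dim
        persistence values are the weights at which components merge, which are
        exactly the maximum-spanning-tree edge weights.
        """
        n_out, n_in = W.shape
        w = np.abs(W * x[None, :])              # edge weights, shape (n_out, n_in)
        order = np.argsort(w, axis=None)[::-1]  # edges by decreasing weight
        uf = _UnionFind(n_in + n_out)
        merges = []
        for e in order:
            j, i = divmod(int(e), n_in)
            if uf.union(i, n_in + j):           # this edge merges two components
                merges.append(w[j, i])
                if len(merges) == n_in + n_out - 1:  # spanning tree complete
                    break
        return np.sort(np.array(merges))

    def network_diagrams(x, layer_weights, activation=np.tanh):
        """One diagram per dense layer, computed while forward-propagating x."""
        diagrams = []
        for W in layer_weights:
            diagrams.append(layer_diagram(x, W))
            x = activation(W @ x)
        return diagrams

    def diagram_distance(d1, d2):
        """Padded-L2 stand-in for a Wasserstein-type distance between diagrams."""
        n = max(len(d1), len(d2))
        return np.linalg.norm(np.pad(d1, (0, n - len(d1))) -
                              np.pad(d2, (0, n - len(d2))))

    def topological_uncertainty(x, layer_weights, class_avg_diagrams):
        """TU of x: mean distance between its layer diagrams and the per-layer
        average diagrams of the class the network predicts for x."""
        diagrams = network_diagrams(x, layer_weights)
        return float(np.mean([diagram_distance(d, ref)
                              for d, ref in zip(diagrams, class_avg_diagrams)]))

In this sketch, class_avg_diagrams would be built by averaging, per layer and per predicted class, the sorted merge-weight vectors of training points (each layer yields vectors of a fixed length, so entrywise means are well defined); a new observation whose TU exceeds a threshold calibrated on validation data could then be flagged, e.g. for Out-Of-Distribution detection.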
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347]
We investigate fully-connected, wide neural networks learning classification tasks.
We show that the networks acquire strong, data-dependent features.
Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
arXiv Detail & Related papers (2024-06-24T14:50:05Z)
- The Multiple Subnetwork Hypothesis: Enabling Multidomain Learning by Isolating Task-Specific Subnetworks in Feedforward Neural Networks [0.0]
We identify a methodology and network representational structure which allows a pruned network to employ previously unused weights to learn subsequent tasks.
We show that networks trained using our approaches are able to learn multiple tasks, which may be related or unrelated, in parallel or in sequence without sacrificing performance on any task or exhibiting catastrophic forgetting.
arXiv Detail & Related papers (2022-07-18T15:07:13Z)
- Quasi-orthogonality and intrinsic dimensions as measures of learning and generalisation [55.80128181112308]
We show that the dimensionality and quasi-orthogonality of a neural network's feature space may jointly serve as discriminants of its performance.
Our findings suggest important relationships between a network's final performance and the properties of its randomly initialised feature space.
arXiv Detail & Related papers (2022-03-30T21:47:32Z)
- Network Embedding via Deep Prediction Model [25.727377978617465]
This paper proposes a network embedding framework to capture the transfer behaviors on structured networks via deep prediction models.
A network structure embedding layer is added to conventional deep prediction models, including Long Short-Term Memory networks and Recurrent Neural Networks.
Experimental studies are conducted on various datasets, including social, citation, biomedical, collaboration, and language networks.
arXiv Detail & Related papers (2021-04-27T16:56:00Z)
- Joint Learning of Neural Transfer and Architecture Adaptation for Image Recognition [77.95361323613147]
Current state-of-the-art visual recognition systems rely on pretraining a neural network on a large-scale dataset and finetuning the network weights on a smaller dataset.
In this work, we show that dynamically adapting network architectures tailored to each domain task, along with weight finetuning, benefits both efficiency and effectiveness.
Our method can be easily generalized to an unsupervised paradigm by replacing supernet training with self-supervised learning in the source domain tasks and performing linear evaluation in the downstream tasks.
arXiv Detail & Related papers (2021-03-31T08:15:17Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information in network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embeddings from high-dimensional attributes and local structure (a toy sketch of the instance-pair idea appears after this list).
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- Neural networks adapting to datasets: learning network size and topology [77.34726150561087]
We introduce a flexible setup allowing for a neural network to learn both its size and topology during the course of a gradient-based training.
The resulting network has the structure of a graph tailored to the particular learning task and dataset.
arXiv Detail & Related papers (2020-06-22T12:46:44Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
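
As mentioned in the "Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning" entry above, that method scores the agreement between a target node and sampled local subgraphs. Below is a toy, untrained NumPy sketch of the instance-pair construction only: the one-hop mean aggregation, the bilinear discriminator, and every name here are our simplifying assumptions, not the paper's implementation, which trains a GNN-based model end-to-end.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy attributed network: symmetric adjacency A and node attributes X.
    n, d = 8, 5
    A = (rng.random((n, n)) < 0.3).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                # undirected, no self-loops
    X = rng.standard_normal((n, d))

    W = 0.1 * rng.standard_normal((d, d))      # projection (untrained stand-in)
    B = 0.1 * rng.standard_normal((d, d))      # bilinear discriminator (untrained)

    def subgraph_embedding(node):
        """Mean-aggregate the attributes of a node's one-hop neighbourhood
        (its 'local subgraph' instance), then project."""
        members = np.append(np.flatnonzero(A[node]), node)
        return np.tanh(X[members].mean(axis=0) @ W)

    def pair_score(node, context_node):
        """Agreement between a target node and a (possibly foreign) subgraph."""
        h_node = np.tanh(X[node] @ W)
        return h_node @ B @ subgraph_embedding(context_node)

    def anomaly_score(node, n_neg=4):
        """Positive pair: the node with its own neighbourhood; negative pairs:
        the node with other nodes' neighbourhoods.  A normal node should agree
        with its own context more than with foreign ones."""
        pos = pair_score(node, node)
        others = np.array([v for v in range(n) if v != node])
        neg = np.mean([pair_score(node, v)
                       for v in rng.choice(others, size=n_neg, replace=False)])
        return neg - pos                       # higher = more anomalous

    print(np.round([anomaly_score(v) for v in range(n)], 3))

In the paper's actual pipeline these scores would come from a discriminator trained with a contrastive objective over many sampled pairs; the sketch only shows how the instance pairs are formed and scored.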
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.