Into the Unknown: Active Monitoring of Neural Networks
- URL: http://arxiv.org/abs/2009.06429v4
- Date: Fri, 12 Nov 2021 14:40:39 GMT
- Title: Into the Unknown: Active Monitoring of Neural Networks
- Authors: Anna Lukina, Christian Schilling, Thomas A. Henzinger
- Abstract summary: We introduce an algorithmic framework for active monitoring of a neural network.
A monitor wrapped in our framework operates in parallel with the neural network and interacts with a human user.
An experimental evaluation on a diverse set of benchmarks confirms the benefits of our active monitoring framework in dynamic scenarios.
- Score: 9.591060426695748
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural-network classifiers achieve high accuracy when predicting the class of
an input that they were trained to identify. Maintaining this accuracy in
dynamic environments, where inputs frequently fall outside the fixed set of
initially known classes, remains a challenge. The typical approach is to detect
inputs from novel classes and retrain the classifier on an augmented dataset.
However, not only the classifier but also the detection mechanism needs to
adapt in order to distinguish between newly learned and yet unknown input
classes. To address this challenge, we introduce an algorithmic framework for
active monitoring of a neural network. A monitor wrapped in our framework
operates in parallel with the neural network and interacts with a human user
via a series of interpretable labeling queries for incremental adaptation. In
addition, we propose an adaptive quantitative monitor to improve precision. An
experimental evaluation on a diverse set of benchmarks with varying numbers of
classes confirms the benefits of our active monitoring framework in dynamic
scenarios.
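The abstract describes the interaction pattern rather than an implementation, so the following is a minimal sketch of such an active-monitoring loop: a monitor watches the classifier's feature representation, raises a labeling query to the human user when an input looks unfamiliar, and incrementally updates both its class model and its threshold. The distance-to-centroid novelty score, the threshold update rule, and all names below are illustrative assumptions, not the paper's algorithm.
```python
import numpy as np

class ActiveMonitor:
    """Toy monitor wrapping a classifier's feature outputs (illustrative only)."""

    def __init__(self, threshold=3.0, adapt_rate=0.05):
        self.centroids = {}          # class label -> running mean feature vector
        self.counts = {}             # class label -> number of samples seen
        self.threshold = threshold   # novelty threshold, adapted over time
        self.adapt_rate = adapt_rate

    def novelty(self, feature):
        """Distance of the feature vector to the nearest known-class centroid."""
        if not self.centroids:
            return np.inf
        return min(np.linalg.norm(feature - c) for c in self.centroids.values())

    def observe(self, feature, predicted_label, ask_human):
        """Accept the prediction, or query the human user on a suspected novelty."""
        if self.novelty(feature) > self.threshold:
            true_label = ask_human(feature)            # interpretable labeling query
            self._update(feature, true_label)
            self.threshold *= 1 + self.adapt_rate      # crude quantitative adaptation
            return true_label
        self._update(feature, predicted_label)
        return predicted_label

    def _update(self, feature, label):
        n = self.counts.get(label, 0)
        c = self.centroids.get(label, np.zeros_like(feature))
        self.centroids[label] = (c * n + feature) / (n + 1)
        self.counts[label] = n + 1

# Toy usage, with random vectors standing in for the network's hidden features.
def human_oracle(feature):
    return "new_class"                                 # stand-in for the human user

rng = np.random.default_rng(1)
monitor = ActiveMonitor()
for _ in range(10):
    monitor.observe(rng.normal(size=8), predicted_label="known_class",
                    ask_human=human_oracle)
```
Because this toy monitor only consumes feature vectors and labels, it can run in parallel with any classifier; the paper's monitor construction and query selection are more elaborate than this sketch.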
Related papers
- Activate and Reject: Towards Safe Domain Generalization under Category Shift [71.95548187205736]
We study a practical problem of Domain Generalization under Category Shift (DGCS)
It aims to simultaneously detect unknown-class samples and classify known-class samples in the target domains.
Compared to prior DG works, we face two new challenges: 1) how to learn the concept of "unknown" during training with only source known-class samples, and 2) how to adapt the source-trained model to unseen environments.
arXiv Detail & Related papers (2023-10-07T07:53:12Z)
- Adversarial Sample Detection Through Neural Network Transport Dynamics [18.08752807817708]
We propose a detector of adversarial samples based on the view of neural networks as discrete dynamic systems.
The detector tells clean inputs from abnormal ones by comparing the discrete vector fields they follow through the layers.
We show that regularizing this vector field during training makes the network more regular on the data distribution's support.
arXiv Detail & Related papers (2023-06-07T08:47:41Z)
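As a rough illustration of the idea summarized above (comparing the discrete "vector field" an input follows through the layers against the field induced by clean data), here is a toy sketch. It assumes per-layer feature vectors of equal dimension and a simple z-score over displacement norms; the function names and the scoring rule are assumptions, not the paper's detector.
```python
import numpy as np

def layer_displacements(activations):
    """activations: list of per-layer feature vectors of equal dimension."""
    return [b - a for a, b in zip(activations[:-1], activations[1:])]

def fit_clean_statistics(clean_activation_sets):
    """Mean/std of displacement norms per layer, estimated from clean inputs."""
    norms = np.array([[np.linalg.norm(d) for d in layer_displacements(acts)]
                      for acts in clean_activation_sets])
    return norms.mean(axis=0), norms.std(axis=0) + 1e-8

def anomaly_score(activations, mean, std):
    """Aggregate z-score of the input's displacement norms; high = suspicious."""
    norms = np.array([np.linalg.norm(d) for d in layer_displacements(activations)])
    return float(np.abs((norms - mean) / std).mean())

# Toy usage with random vectors standing in for real hidden activations.
rng = np.random.default_rng(0)
clean = [[rng.normal(size=16) for _ in range(5)] for _ in range(100)]
mean, std = fit_clean_statistics(clean)
test = [rng.normal(size=16) * 3.0 for _ in range(5)]   # atypical layer trajectory
print(anomaly_score(test, mean, std))
```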
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Anomaly Detection using Ensemble Classification and Evidence Theory [62.997667081978825]
We present a novel approach for anomaly detection using ensemble classification and evidence theory.
A pool selection strategy is presented to build a solid ensemble classifier.
Uncertainty is then used as the basis for detecting anomalous inputs.
arXiv Detail & Related papers (2022-12-23T00:50:41Z)
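A minimal sketch of the ensemble-plus-uncertainty idea in the entry above. It uses the entropy of the averaged class probabilities and member disagreement as uncertainty signals, rather than the paper's evidence-theory (Dempster-Shafer) combination; all names and the threshold are illustrative.
```python
import numpy as np

def ensemble_uncertainty(member_probs):
    """member_probs: array of shape (n_members, n_classes) for one input."""
    mean_p = member_probs.mean(axis=0)
    entropy = -np.sum(mean_p * np.log(mean_p + 1e-12))            # total uncertainty
    disagreement = np.mean(np.sum((member_probs - mean_p) ** 2, axis=1))
    return entropy, disagreement

def is_anomaly(member_probs, entropy_threshold=1.0):
    entropy, _ = ensemble_uncertainty(member_probs)
    return entropy > entropy_threshold

# Example: three ensemble members that disagree strongly -> high uncertainty.
probs = np.array([[0.9, 0.05, 0.05],
                  [0.1, 0.8, 0.1],
                  [0.2, 0.1, 0.7]])
print(ensemble_uncertainty(probs), is_anomaly(probs))
```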
- Automatic Change-Point Detection in Time Series via Deep Learning [8.43086628139493]
We show how to automatically generate new offline detection methods based on training a neural network.
We present theory that quantifies the error rate for such an approach, and how it depends on the amount of training data.
Our method also shows strong results in detecting and localising changes in activity based on accelerometer data.
arXiv Detail & Related papers (2022-11-07T20:59:14Z)
- Self-supervised Pretraining with Classification Labels for Temporal Activity Detection [54.366236719520565]
Temporal Activity Detection aims to predict activity classes per frame.
Due to the expensive frame-level annotations required for detection, the scale of detection datasets is limited.
This work proposes a novel self-supervised pretraining method for detection leveraging classification labels.
arXiv Detail & Related papers (2021-11-26T18:59:28Z)
- Incremental Deep Neural Network Learning using Classification Confidence Thresholding [4.061135251278187]
Most modern neural networks for classification fail to take into account the concept of the unknown.
This paper proposes the Classification Confidence Threshold approach to prime neural networks for incremental learning.
arXiv Detail & Related papers (2021-06-21T22:46:28Z)
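The entry above describes thresholding classification confidence to recognize the unknown; here is a minimal sketch of that decision rule, assuming a plain softmax confidence and an arbitrary threshold value (both are assumptions, not the paper's exact formulation).
```python
import numpy as np

def softmax(logits):
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def classify_or_unknown(logits, threshold=0.7):
    """Return the predicted class index, or "unknown" if confidence is too low."""
    probs = softmax(np.asarray(logits, dtype=float))
    top = int(np.argmax(probs))
    return top if probs[top] >= threshold else "unknown"

print(classify_or_unknown([4.0, 0.5, 0.2]))   # confident -> class index 0
print(classify_or_unknown([1.0, 0.9, 0.8]))   # low confidence -> "unknown"
```
Inputs rejected this way are the natural candidates for incremental learning of new classes.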
- DAAIN: Detection of Anomalous and Adversarial Input using Normalizing Flows [52.31831255787147]
We introduce a novel technique, DAAIN, to detect out-of-distribution (OOD) inputs and adversarial attacks (AA).
Our approach monitors the inner workings of a neural network and learns a density estimator of the activation distribution.
Our model can be trained on a single GPU making it compute efficient and deployable without requiring specialized accelerators.
arXiv Detail & Related papers (2021-05-30T22:07:13Z)
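A toy sketch of the "density estimator over activations" idea described above. DAAIN learns a normalizing flow; as a stand-in, this version fits a diagonal Gaussian to clean activations and flags inputs whose activation log-likelihood is unusually low. The Gaussian substitute, the fixed margin, and all names are assumptions.
```python
import numpy as np

def fit_activation_density(clean_activations):
    """Diagonal-Gaussian stand-in for a learned density over activations."""
    acts = np.asarray(clean_activations)
    return acts.mean(axis=0), acts.var(axis=0) + 1e-6

def log_likelihood(activation, mean, var):
    return float(-0.5 * np.sum((activation - mean) ** 2 / var
                               + np.log(2 * np.pi * var)))

# Flag inputs far below the typical log-likelihood of clean data.
rng = np.random.default_rng(2)
clean = rng.normal(size=(500, 32))                # clean activation vectors
mean, var = fit_activation_density(clean)
ref = np.median([log_likelihood(a, mean, var) for a in clean])
suspicious = rng.normal(loc=4.0, size=32)         # shifted activation pattern
print(log_likelihood(suspicious, mean, var) < ref - 50.0)
```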
- Conditional Variational Capsule Network for Open Set Recognition [64.18600886936557]
In open set recognition, a classifier has to detect unknown classes that are not known at training time.
Recently proposed Capsule Networks have shown to outperform alternatives in many fields, particularly in image recognition.
In our proposal, during training, capsule features of the same known class are encouraged to match a pre-defined Gaussian, one for each class.
arXiv Detail & Related papers (2021-04-19T09:39:30Z)
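A small sketch of the per-class Gaussian-matching idea summarized above: each known class is assigned a fixed target Gaussian, and a training sample's encoded diagonal-Gaussian posterior is penalized by its KL divergence from its class's target. The priors, dimensions, and names are illustrative assumptions, not the paper's loss.
```python
import numpy as np

def kl_diag_gaussians(mu_q, var_q, mu_p, var_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal covariances."""
    return float(0.5 * np.sum(np.log(var_p / var_q)
                              + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0))

# Pre-defined target Gaussians, one per known class (unit variance, shifted means).
class_priors = {c: (np.full(8, 3.0 * c), np.ones(8)) for c in range(3)}

# A sample's encoded posterior should match the prior of its own class.
mu_q, var_q = np.full(8, 0.1), np.full(8, 0.9)
print(kl_diag_gaussians(mu_q, var_q, *class_priors[0]))   # small: matches class 0
print(kl_diag_gaussians(mu_q, var_q, *class_priors[2]))   # large: far from class 2
```
At test time, an input whose features match none of the class targets is a natural candidate for rejection as unknown, which is one way such a model can handle open-set inputs.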
- Improving Video Instance Segmentation by Light-weight Temporal Uncertainty Estimates [11.580916951856256]
We present a time-dynamic approach to model uncertainties of instance segmentation networks.
We apply this approach to the detection of false positives and the estimation of prediction quality.
The proposed method only requires a readily trained neural network and video sequence input.
arXiv Detail & Related papers (2020-12-14T13:39:05Z)
- Deep Active Learning in Remote Sensing for data efficient Change Detection [26.136331738529243]
We investigate active learning in the context of deep neural network models for change detection and map updating.
In active learning, one starts from a minimal set of training examples and progressively chooses informative samples annotated by a user.
We show that active learning successfully finds highly informative samples and automatically balances the training distribution.
arXiv Detail & Related papers (2020-08-25T17:58:17Z)
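The entry above follows the standard active-learning recipe of starting from a minimal labeled set and progressively querying a user; the sketch below shows that generic loop with uncertainty (margin) sampling on synthetic data. The acquisition rule, model, and data are assumptions and do not reproduce the paper's change-detection setup.
```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))
y_true = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)  # hidden labels

# Minimal initial training set containing one example of each class.
labeled = [int(np.flatnonzero(y_true == 0)[0]), int(np.flatnonzero(y_true == 1)[0])]
unlabeled = [i for i in range(200) if i not in labeled]

for _ in range(20):                         # 20 simulated user interactions
    model = LogisticRegression().fit(X[labeled], y_true[labeled])
    probs = model.predict_proba(X[unlabeled])
    margin = np.abs(probs[:, 1] - 0.5)      # small margin = uncertain sample
    pick = unlabeled[int(np.argmin(margin))]
    labeled.append(pick)                    # "user" supplies the label y_true[pick]
    unlabeled.remove(pick)

print(model.score(X, y_true))               # accuracy after active labeling
```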
This list is automatically generated from the titles and abstracts of the papers indexed on this site.
The site does not guarantee the quality of the information above and is not responsible for any consequences of its use.