One-vs-Rest Network-based Deep Probability Model for Open Set
Recognition
- URL: http://arxiv.org/abs/2004.08067v2
- Date: Thu, 28 May 2020 06:37:47 GMT
- Title: One-vs-Rest Network-based Deep Probability Model for Open Set
Recognition
- Authors: Jaeyeon Jang and Chang Ouk Kim
- Abstract summary: An intelligent self-learning system should be able to differentiate between known and unknown examples.
One-vs-rest networks can provide more informative hidden representations for unknown examples than the commonly used SoftMax layer.
The proposed probability model outperformed the state-of-the-art methods in open set classification scenarios.
- Score: 6.85316573653194
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unknown examples that are unseen during training often appear in real-world
computer vision tasks, and an intelligent self-learning system should be able
to differentiate between known and unknown examples. Open set recognition,
which addresses this problem, has been studied for approximately a decade.
However, conventional open set recognition methods based on deep neural
networks (DNNs) lack a foundation for post-recognition score analysis. In this
paper, we propose a DNN structure in which multiple one-vs-rest sigmoid
networks follow a convolutional neural network feature extractor. A one-vs-rest
network, which is composed of rectified linear unit activation functions for
the hidden layers and a single sigmoid target class output node, can maximize
the ability to learn information from nonmatch examples. Furthermore, the
network yields a sophisticated nonlinear features-to-output mapping that is
explainable in the feature space. By introducing extreme value theory-based
calibration techniques, the nonlinear and explainable mapping provides a
well-grounded class membership probability model. Our experiments show that
one-vs-rest networks can provide more informative hidden representations for
unknown examples than the commonly used SoftMax layer. In addition, the
proposed probability model outperformed the state-of-the-art methods in open
set classification scenarios.
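The architecture described above can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: a shared CNN feature extractor feeding one small one-vs-rest head per class, each head built from ReLU hidden layers and a single sigmoid output node. All layer sizes and names are assumptions for illustration.

```python
# Minimal sketch (assumed, not the paper's code) of a CNN feature extractor
# followed by K one-vs-rest sigmoid heads, one per known class.
import torch
import torch.nn as nn

class OneVsRestHead(nn.Module):
    """ReLU hidden layer ending in a single sigmoid target-class output node."""
    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # single target-class logit
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x)).squeeze(-1)

class OvRNetwork(nn.Module):
    """Shared CNN feature extractor followed by per-class one-vs-rest heads."""
    def __init__(self, num_classes: int, feat_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, feat_dim), nn.ReLU(),
        )
        self.heads = nn.ModuleList(
            OneVsRestHead(feat_dim) for _ in range(num_classes)
        )

    def forward(self, x):
        z = self.features(x)
        # One match probability per class; all K can be low for an unknown
        # example, unlike a softmax layer whose outputs must sum to 1.
        return torch.stack([head(z) for head in self.heads], dim=1)

model = OvRNetwork(num_classes=10)
probs = model(torch.randn(4, 3, 32, 32))  # shape (4, 10), each entry in (0, 1)
```

Because each head is trained independently against all nonmatch examples, an unknown input can receive a low score from every head, which is the property the paper's EVT-based calibration then turns into a class membership probability.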
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347]
We investigate fully-connected, wide neural networks learning classification tasks.
We show that the networks acquire strong, data-dependent features.
Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
arXiv Detail & Related papers (2024-06-24T14:50:05Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Opening Deep Neural Networks with Generative Models [2.0962464943252934]
We propose GeMOS: simple and plug-and-play open set recognition modules that can be attached to pretrained Deep Neural Networks for visual recognition.
The GeMOS framework pairs pre-trained Convolutional Neural Networks with generative models for open set recognition to extract open set scores for each sample.
We conduct a thorough evaluation of the proposed method in comparison with state-of-the-art open set algorithms, finding that GeMOS either outperforms or is statistically indistinguishable from more complex and costly models.
arXiv Detail & Related papers (2021-05-20T20:02:29Z)
- Open-set Recognition based on the Combination of Deep Learning and Ensemble Method for Detecting Unknown Traffic Scenarios [0.9711326718689492]
This work proposes a combination of Convolutional Neural Networks (CNN) and Random Forest (RF) for open set recognition of traffic scenarios.
By inheriting the ensemble nature of RF, the vote pattern of all trees combined with extreme value theory is shown to be well suited for detecting unknown classes.
arXiv Detail & Related papers (2021-05-17T06:48:15Z)
- Conditional Variational Capsule Network for Open Set Recognition [64.18600886936557]
In open set recognition, a classifier has to detect unknown classes that are not known at training time.
Recently proposed Capsule Networks have shown to outperform alternatives in many fields, particularly in image recognition.
In our proposal, during training, capsule features of the same known class are encouraged to match a pre-defined Gaussian, one for each class.
arXiv Detail & Related papers (2021-04-19T09:39:30Z)
- Collective Decision of One-vs-Rest Networks for Open Set Recognition [0.0]
We propose a simple open set recognition (OSR) method based on the intuition that OSR performance can be maximized by setting strict and sophisticated decision boundaries.
The proposed method performed significantly better than the state-of-the-art methods by effectively reducing overgeneralization.
arXiv Detail & Related papers (2021-03-18T13:06:46Z)
- Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
- How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks [80.55378250013496]
We study how neural networks trained by gradient descent extrapolate what they learn outside the support of the training distribution.
Graph Neural Networks (GNNs) have shown some success in more complex tasks.
arXiv Detail & Related papers (2020-09-24T17:48:59Z)
- Open Set Recognition with Conditional Probabilistic Generative Models [51.40872765917125]
We propose Conditional Probabilistic Generative Models (CPGM) for open set recognition.
CPGM can detect unknown samples but also classify known classes by forcing different latent features to approximate conditional Gaussian distributions.
Experiment results on multiple benchmark datasets reveal that the proposed method significantly outperforms the baselines.
arXiv Detail & Related papers (2020-08-12T06:23:49Z)
- Conditional Gaussian Distribution Learning for Open Set Recognition [10.90687687505665]
We propose Conditional Gaussian Distribution Learning (CGDL) for open set recognition.
In addition to detecting unknown samples, this method can also classify known samples by forcing different latent features to approximate different Gaussian models.
Experiments on several standard image datasets reveal that the proposed method significantly outperforms the baseline method and achieves new state-of-the-art results.
arXiv Detail & Related papers (2020-03-19T14:32:08Z)
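The CGDL entry above describes classifying known samples by forcing latent features to approximate class-conditional Gaussians. A minimal sketch of that idea, with simulated latent features and arbitrary dimensions chosen for illustration (not the CGDL code), is: fit one Gaussian per known class, classify by highest log-likelihood, and reject as unknown when every class's likelihood falls below a threshold.

```python
# Illustrative sketch (assumed, not CGDL's implementation) of class-conditional
# Gaussian scoring in a latent space for open set recognition.
import numpy as np

rng = np.random.default_rng(1)
latent_dim, n_per_class = 2, 200
class_means = np.array([[0.0, 0.0], [5.0, 5.0]])  # assumed latent centers

# Fit one diagonal Gaussian per known class from (simulated) latent features.
params = []
for mu in class_means:
    feats = rng.normal(mu, 0.5, size=(n_per_class, latent_dim))
    params.append((feats.mean(axis=0), feats.var(axis=0)))

def log_likelihood(z, mean, var):
    """Diagonal-Gaussian log-density of latent vector z."""
    return float(-0.5 * np.sum(np.log(2 * np.pi * var) + (z - mean) ** 2 / var))

def recognize(z, threshold=-20.0):
    """Return the best class index, or -1 (unknown) if all likelihoods are low."""
    lls = [log_likelihood(z, m, v) for m, v in params]
    best = int(np.argmax(lls))
    return best if lls[best] >= threshold else -1

known = recognize(np.array([0.1, -0.2]))      # close to class 0's Gaussian
unknown = recognize(np.array([50.0, -50.0]))  # far from every class -> -1
```

The rejection threshold here is a hand-picked assumption; in practice it would be tuned (e.g. on a validation split), or replaced by the EVT-based calibration discussed in the main paper.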
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information (including all listed summaries) and is not responsible for any consequences of its use.