Logit Normalization for Long-tail Object Detection
- URL: http://arxiv.org/abs/2203.17020v1
- Date: Thu, 31 Mar 2022 13:28:51 GMT
- Title: Logit Normalization for Long-tail Object Detection
- Authors: Liang Zhao, Yao Teng, Limin Wang
- Abstract summary: Real-world data exhibiting skewed distributions pose a serious challenge to existing object detectors.
We propose Logit Normalization (LogN), a technique to self-calibrate the classification logits of detectors in a similar way to batch normalization.
In general, our LogN is training- and tuning-free (i.e. requiring no extra training or tuning), model- and label distribution-agnostic, and also plug-and-play.
- Score: 32.18963619434191
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world data exhibiting skewed distributions pose a serious challenge to
existing object detectors. Moreover, the samplers in detectors lead to shifted
training label distributions, while the tremendous proportion of background to
foreground samples severely harms foreground classification. To mitigate these
issues, in this paper, we propose Logit Normalization (LogN), a simple
technique to self-calibrate the classification logits of detectors in a similar
way to batch normalization. In general, our LogN is training- and tuning-free
(i.e. requiring no extra training or tuning), model- and label
distribution-agnostic (i.e. generalizing to different kinds of detectors and
datasets), and also plug-and-play (i.e. directly applicable without any bells
and whistles). Extensive experiments on the LVIS dataset demonstrate superior
performance of LogN to state-of-the-art methods with various detectors and
backbones. We also provide in-depth studies on different aspects of our LogN.
Further experiments on ImageNet-LT reveal its competitiveness and
generalizability. Our LogN can serve as a strong baseline for long-tail object
detection and is expected to inspire future research in this field. Code and
trained models will be publicly available at https://github.com/MCG-NJU/LogN.
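The abstract describes LogN only at a high level: calibrating classification logits with statistics estimated from the logits themselves, in the spirit of batch normalization, with no extra training. As an illustration only, here is a minimal sketch of that idea; the function name and the exact statistics-estimation procedure are assumptions, not the paper's implementation:

```python
import numpy as np

def logit_normalize(logits, eps=1e-6):
    """Standardize each class dimension of the logits using statistics
    computed from the logits themselves (no training, no tuning),
    in the spirit of batch normalization.

    logits: (N, C) array of raw classification scores.
    """
    mean = logits.mean(axis=0, keepdims=True)  # per-class mean over the batch
    std = logits.std(axis=0, keepdims=True)    # per-class std over the batch
    return (logits - mean) / (std + eps)

# Toy example: class 0 has a large systematic offset (a "head" class);
# normalization removes the per-class bias so classes compete on even terms.
rng = np.random.default_rng(0)
raw = rng.normal(size=(8, 3)) + np.array([5.0, 0.0, -5.0])
calibrated = logit_normalize(raw)
```

In this toy setting the calibrated logits have zero mean and unit variance per class, which is the sense in which such a scheme can counteract a shifted training label distribution at inference time.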
Related papers
- PS-TTL: Prototype-based Soft-labels and Test-Time Learning for Few-shot Object Detection [21.443060372419286]
Few-Shot Object Detection (FSOD) has gained widespread attention and made significant progress.
We propose a new framework for FSOD, namely Prototype-based Soft-labels and Test-Time Learning (PS-TTL).
arXiv Detail & Related papers (2024-08-11T02:21:43Z)
- GOODAT: Towards Test-time Graph Out-of-Distribution Detection [103.40396427724667]
Graph neural networks (GNNs) have found widespread application in modeling graph data across diverse domains.
Recent studies have explored graph OOD detection, often focusing on training a specific model or modifying the data on top of a well-trained GNN.
This paper introduces a data-centric, unsupervised, and plug-and-play solution that operates independently of training data and modifications of GNN architecture.
arXiv Detail & Related papers (2024-01-10T08:37:39Z)
- How Low Can You Go? Surfacing Prototypical In-Distribution Samples for Unsupervised Anomaly Detection [48.30283806131551]
We show that UAD with extremely few training samples can already match -- and in some cases even surpass -- the performance of training with the whole training dataset.
We propose an unsupervised method to reliably identify prototypical samples to further boost UAD performance.
arXiv Detail & Related papers (2023-12-06T15:30:47Z)
- AnoShift: A Distribution Shift Benchmark for Unsupervised Anomaly Detection [7.829710051617368]
We introduce an unsupervised anomaly detection benchmark with data that shifts over time, built over Kyoto-2006+, a traffic dataset for network intrusion detection.
We first highlight the non-stationary nature of the data, using a basic per-feature analysis, t-SNE, and an Optimal Transport approach for measuring the overall distribution distances between years.
We validate the performance degradation over time with diverse models, ranging from classical approaches to deep learning.
arXiv Detail & Related papers (2022-06-30T17:59:22Z)
- Attentive Prototypes for Source-free Unsupervised Domain Adaptive 3D Object Detection [85.11649974840758]
3D object detection networks tend to be biased towards the data they are trained on.
We propose a single-frame approach for source-free, unsupervised domain adaptation of lidar-based 3D object detectors.
arXiv Detail & Related papers (2021-11-30T18:42:42Z)
- DAAIN: Detection of Anomalous and Adversarial Input using Normalizing Flows [52.31831255787147]
We introduce a novel technique, DAAIN, to detect out-of-distribution (OOD) inputs and adversarial attacks (AA).
Our approach monitors the inner workings of a neural network and learns a density estimator of the activation distribution.
Our model can be trained on a single GPU making it compute efficient and deployable without requiring specialized accelerators.
arXiv Detail & Related papers (2021-05-30T22:07:13Z) - SSD: A Unified Framework for Self-Supervised Outlier Detection [37.254114112911786]
We propose an outlier detector based on only unlabeled in-distribution data.
We use self-supervised representation learning followed by a Mahalanobis distance based detection.
We extend our framework to incorporate training data labels, if available.
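The Mahalanobis-distance step of this pipeline is a standard technique and can be sketched briefly; the representation-learning stage is omitted, and the function names below are illustrative, not SSD's actual code:

```python
import numpy as np

def fit_gaussian(features):
    """Fit mean and inverse covariance of in-distribution features (N, D)."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    cov += 1e-6 * np.eye(cov.shape[0])  # regularize for stable inversion
    return mu, np.linalg.inv(cov)

def mahalanobis_score(x, mu, cov_inv):
    """Squared Mahalanobis distance of each row of x to the fitted Gaussian;
    larger scores indicate likelier outliers."""
    d = x - mu
    return np.einsum('nd,de,ne->n', d, cov_inv, d)

# Toy example: score inliers vs. points shifted far from the training cloud.
rng = np.random.default_rng(1)
inlier = rng.normal(size=(500, 4))          # stand-in for learned features
mu, cov_inv = fit_gaussian(inlier)
scores_in = mahalanobis_score(inlier[:5], mu, cov_inv)
scores_out = mahalanobis_score(inlier[:5] + 10.0, mu, cov_inv)
```

Points far from the in-distribution feature cloud receive much larger scores, which is the basis for thresholding them as outliers.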
arXiv Detail & Related papers (2021-03-22T17:51:35Z) - Self-Attentive Classification-Based Anomaly Detection in Unstructured
Logs [59.04636530383049]
We propose Logsy, a classification-based method to learn log representations.
We show an average improvement of 0.25 in the F1 score, compared to the previous methods.
arXiv Detail & Related papers (2020-08-21T07:26:55Z) - UniT: Unified Knowledge Transfer for Any-shot Object Detection and
Segmentation [52.487469544343305]
Methods for object detection and segmentation rely on large scale instance-level annotations for training.
We propose an intuitive and unified semi-supervised model that is applicable to a range of supervision.
arXiv Detail & Related papers (2020-06-12T22:45:47Z)
- Incremental Few-Shot Object Detection [96.02543873402813]
OpeN-ended Centre nEt (ONCE) is a detector that incrementally learns to detect novel classes of objects from few examples.
ONCE fully respects the incremental learning paradigm, with novel class registration requiring only a single forward pass of few-shot training samples.
arXiv Detail & Related papers (2020-03-10T12:56:59Z)
- Anomaly Detection by One Class Latent Regularized Networks [36.67420338535258]
Semi-supervised Generative Adversarial Network (GAN)-based methods have recently been gaining popularity in anomaly detection tasks.
A novel adversarial dual autoencoder network is proposed, in which the underlying structure of training data is captured in latent feature space.
Experiments show that our model achieves the state-of-the-art results on MNIST and CIFAR10 datasets as well as GTSRB stop signs dataset.
arXiv Detail & Related papers (2020-02-05T02:21:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.