Robust Continual Test-time Adaptation: Instance-aware BN and
Prediction-balanced Memory
- URL: http://arxiv.org/abs/2208.05117v1
- Date: Wed, 10 Aug 2022 03:05:46 GMT
- Title: Robust Continual Test-time Adaptation: Instance-aware BN and
Prediction-balanced Memory
- Authors: Taesik Gong, Jongheon Jeong, Taewon Kim, Yewon Kim, Jinwoo Shin,
Sung-Ju Lee
- Abstract summary: We present a new test-time adaptation scheme that is robust against non-i.i.d. test data streams.
Our novelty is mainly two-fold: (a) Instance-Aware Batch Normalization (IABN), which corrects normalization for out-of-distribution samples, and (b) Prediction-balanced Reservoir Sampling (PBRS), which simulates an i.i.d. data stream from a non-i.i.d. stream in a class-balanced manner.
- Score: 58.72445309519892
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Test-time adaptation (TTA) is an emerging paradigm that addresses
distributional shifts between training and testing phases without additional
data acquisition or labeling cost; only unlabeled test data streams are used
for continual model adaptation. Previous TTA schemes assume that the test
samples are independent and identically distributed (i.i.d.), even though they
are often temporally correlated (non-i.i.d.) in application scenarios, e.g.,
autonomous driving. We discover that most existing TTA methods fail
dramatically under such scenarios. Motivated by this, we present a new
test-time adaptation scheme that is robust against non-i.i.d. test data
streams. Our novelty is mainly two-fold: (a) Instance-Aware Batch Normalization
(IABN), which corrects normalization for out-of-distribution samples, and (b)
Prediction-balanced Reservoir Sampling (PBRS), which simulates an i.i.d. data
stream from a non-i.i.d. stream in a class-balanced manner. Our evaluation with various
datasets, including real-world non-i.i.d. streams, demonstrates that the
proposed robust TTA not only outperforms state-of-the-art TTA algorithms in the
non-i.i.d. setting, but also achieves comparable performance to those
algorithms under the i.i.d. assumption.
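The two components named in the abstract can be sketched in Python. Both sketches are assumption-based readings of the abstract, not the authors' exact formulations: the `instance_aware_bn` function normalizes with the learned (source) running statistics by default and corrects them toward instance-wise statistics only when a channel deviates by more than `alpha` standard errors (treating such channels as out-of-distribution), and `PredictionBalancedReservoir` keeps a memory balanced across *predicted* labels by evicting from the largest class. All parameter names and the exact eviction rule are our assumptions.

```python
import random
from collections import defaultdict

import numpy as np


def instance_aware_bn(x, running_mean, running_var, gamma, beta,
                      alpha=4.0, eps=1e-5):
    """Normalize x (N, C, H, W) with source running statistics, correcting
    them toward instance statistics for channels whose statistics deviate
    by more than `alpha` standard errors (treated as out-of-distribution)."""
    n = x.shape[2] * x.shape[3]
    mu = x.mean(axis=(2, 3), keepdims=True)      # instance-wise mean
    var = x.var(axis=(2, 3), keepdims=True)      # instance-wise variance
    r_mu = running_mean.reshape(1, -1, 1, 1)
    r_var = running_var.reshape(1, -1, 1, 1)
    se_mu = np.sqrt(r_var / n)                   # std. error of the mean
    se_var = r_var * np.sqrt(2.0 / (n - 1))      # std. error of the variance

    def shrink(delta, se):
        # Soft-shrinkage: ignore deviations within alpha standard errors.
        return np.sign(delta) * np.maximum(np.abs(delta) - alpha * se, 0.0)

    mu_hat = r_mu + shrink(mu - r_mu, se_mu)
    var_hat = np.maximum(r_var + shrink(var - r_var, se_var), 0.0)
    g, b = gamma.reshape(1, -1, 1, 1), beta.reshape(1, -1, 1, 1)
    return g * (x - mu_hat) / np.sqrt(var_hat + eps) + b


class PredictionBalancedReservoir:
    """Memory balanced across *predicted* labels, so replaying it
    approximates an i.i.d., class-balanced stream."""

    def __init__(self, capacity, num_classes):
        self.capacity = capacity
        self.num_classes = num_classes
        self.memory = []               # (sample, predicted_label) pairs
        self.seen = defaultdict(int)   # per-class samples seen so far

    def _indices_of(self, label):
        return [i for i, (_, y) in enumerate(self.memory) if y == label]

    def add(self, sample, pred_label):
        self.seen[pred_label] += 1
        if len(self.memory) < self.capacity:
            self.memory.append((sample, pred_label))
            return
        counts = defaultdict(int)
        for _, y in self.memory:
            counts[y] += 1
        if counts[pred_label] < self.capacity / self.num_classes:
            # Under-represented class: evict from the largest class.
            majority = max(counts, key=counts.get)
            idx = random.choice(self._indices_of(majority))
            self.memory[idx] = (sample, pred_label)
        elif random.random() < counts[pred_label] / self.seen[pred_label]:
            # Otherwise: standard reservoir sampling within the class.
            idx = random.choice(self._indices_of(pred_label))
            self.memory[idx] = (sample, pred_label)
```

On a temporally correlated stream (e.g., 100 samples predicted as class 0 followed by 100 predicted as class 1, with capacity 20 and two classes), the memory ends up split evenly between the two predicted classes, which is the class-balanced behavior the abstract describes.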
Related papers
- Distribution Alignment for Fully Test-Time Adaptation with Dynamic Online Data Streams [19.921480334048756]
Test-Time Adaptation (TTA) enables adaptation and inference in test data streams with domain shifts from the source.
We propose a novel Distribution Alignment loss for TTA.
We surpass existing methods in non-i.i.d. scenarios and maintain competitive performance under the ideal i.i.d. assumption.
arXiv Detail & Related papers (2024-07-16T19:33:23Z)
- Active Test-Time Adaptation: Theoretical Analyses and An Algorithm [51.84691955495693]
Test-time adaptation (TTA) addresses distribution shifts for streaming test data in unsupervised settings.
We propose the novel problem setting of active test-time adaptation (ATTA) that integrates active learning within the fully TTA setting.
arXiv Detail & Related papers (2024-04-07T22:31:34Z)
- Diversity-aware Buffer for Coping with Temporally Correlated Data Streams in Online Test-time Adaptation [3.1265626879839923]
Test data streams are not always independent and identically distributed (i.i.d.).
We propose a diversity-aware and category-balanced buffer that can simulate an i.i.d. data stream, even in non-i.i.d. scenarios.
We achieve state-of-the-art results on most considered benchmarks.
arXiv Detail & Related papers (2024-01-02T01:56:25Z)
- Persistent Test-time Adaptation in Recurring Testing Scenarios [12.024233973321756]
Current test-time adaptation (TTA) approaches aim to adapt a machine learning model to environments that change continuously.
Yet, it is unclear whether TTA methods can maintain their adaptability over prolonged periods.
We propose persistent TTA (PeTTA) which senses when the model is diverging towards collapse and adjusts the adaptation strategy.
arXiv Detail & Related papers (2023-11-30T02:24:44Z)
- Generalized Robust Test-Time Adaptation in Continuous Dynamic Scenarios [18.527640606971563]
Test-time adaptation (TTA) adapts pre-trained models to test distributions during the inference phase exclusively employing unlabeled test data streams.
We propose a Generalized Robust Test-Time Adaptation (GRoTTA) method to address this challenging problem effectively.
arXiv Detail & Related papers (2023-10-07T07:13:49Z)
- Robust Test-Time Adaptation in Dynamic Scenarios [9.475271284789969]
Test-time adaptation (TTA) intends to adapt the pretrained model to test distributions with only unlabeled test data streams.
We propose a Robust Test-Time Adaptation (RoTTA) method against the complex data stream in PTTA.
Our method is easy to implement, making it a good choice for rapid deployment.
arXiv Detail & Related papers (2023-03-24T10:19:14Z)
- DELTA: degradation-free fully test-time adaptation [59.74287982885375]
We find that two unfavorable defects are concealed in the prevalent adaptation methodologies like test-time batch normalization (BN) and self-learning.
First, we reveal that the normalization statistics in test-time BN are completely affected by the currently received test samples, resulting in inaccurate estimates.
Second, we show that during test-time adaptation, the parameter update is biased towards some dominant classes.
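The first defect can be illustrated with a toy example (our own illustration under assumed data, not from the paper): when a test batch is class-skewed, the batch statistics that test-time BN would normalize with differ sharply from the class-balanced statistics learned during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Source "training" data: two classes with different feature means, so
# the learned running mean is the class-balanced average (close to 0).
class0 = rng.normal(loc=-1.0, scale=1.0, size=(5000, 8))
class1 = rng.normal(loc=+1.0, scale=1.0, size=(5000, 8))
running_mean = np.concatenate([class0, class1]).mean(axis=0)

# A temporally correlated test batch containing only class-1 samples:
# test-time BN would normalize with this batch's own statistics.
skewed_batch = rng.normal(loc=+1.0, scale=1.0, size=(64, 8))
batch_mean = skewed_batch.mean(axis=0)

# The batch estimate is pulled toward the dominant class, far from the
# running mean the model's features were calibrated against.
gap = np.abs(batch_mean - running_mean).mean()
print(round(float(gap), 2))
```

Here the gap is roughly the class-mean offset (about 1.0), which is exactly the "inaccurate estimates" failure mode the abstract describes for non-i.i.d. streams.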
arXiv Detail & Related papers (2023-01-30T15:54:00Z)
- Sequential Kernelized Independence Testing [101.22966794822084]
We design sequential kernelized independence tests inspired by kernelized dependence measures.
We demonstrate the power of our approaches on both simulated and real data.
arXiv Detail & Related papers (2022-12-14T18:08:42Z)
- CAFA: Class-Aware Feature Alignment for Test-Time Adaptation [50.26963784271912]
Test-time adaptation (TTA) aims to address this challenge by adapting a model to unlabeled data at test time.
We propose a simple yet effective feature alignment loss, termed as Class-Aware Feature Alignment (CAFA), which simultaneously encourages a model to learn target representations in a class-discriminative manner.
arXiv Detail & Related papers (2022-06-01T03:02:07Z)
- Listen, Adapt, Better WER: Source-free Single-utterance Test-time Adaptation for Automatic Speech Recognition [65.84978547406753]
Test-time Adaptation aims to adapt the model trained on source domains to yield better predictions for test samples.
Single-Utterance Test-time Adaptation (SUTA) is, to the best of our knowledge, the first TTA study in the speech area.
arXiv Detail & Related papers (2022-03-27T06:38:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.