Revisiting Realistic Test-Time Training: Sequential Inference and
Adaptation by Anchored Clustering Regularized Self-Training
- URL: http://arxiv.org/abs/2303.10856v1
- Date: Mon, 20 Mar 2023 04:30:18 GMT
- Title: Revisiting Realistic Test-Time Training: Sequential Inference and
Adaptation by Anchored Clustering Regularized Self-Training
- Authors: Yongyi Su, Xun Xu, Tianrui Li, Kui Jia
- Abstract summary: We develop a test-time anchored clustering (TTAC) approach to enable stronger test-time feature learning.
Self-training (ST) has demonstrated great success in learning from unlabeled data.
TTAC++ consistently outperforms the state-of-the-art methods on five TTT datasets.
- Score: 37.75537703971045
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deploying models on target domain data subject to distribution shift requires
adaptation. Test-time training (TTT) emerges as a solution to this adaptation
under a realistic scenario where access to full source domain data is not
available, and instant inference on the target domain is required. Despite many
efforts devoted to TTT, there is confusion over the experimental settings,
leading to unfair comparisons. In this work, we first revisit TTT assumptions
and categorize TTT protocols by two key factors. Among the multiple protocols,
we adopt a realistic sequential test-time training (sTTT) protocol, under which
we develop a test-time anchored clustering (TTAC) approach to enable stronger
test-time feature learning. TTAC discovers clusters in both source and target
domains and matches the target clusters to the source ones to improve
adaptation. When source domain information is strictly absent (i.e.,
source-free), we further develop an efficient method to infer source domain
distributions for anchored clustering. Finally, self-training (ST) has
demonstrated great success in learning from unlabeled data, and we empirically
find that applying ST alone to TTT is prone to confirmation bias.
Therefore, a more effective TTT approach is introduced by regularizing
self-training with anchored clustering, and the improved model is referred to
as TTAC++. We demonstrate that, under all TTT protocols, TTAC++ consistently
outperforms the state-of-the-art methods on five TTT datasets, including
corrupted target domain, selected hard samples, synthetic-to-real adaptation
and adversarially attacked target domain. We hope this work will provide fair
benchmarking of TTT methods, and that future research will be compared within
its respective protocol.
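The anchored-clustering idea described above — modeling per-class feature distributions in the source and target domains and pulling each matched target cluster toward its source anchor — can be sketched in a few lines. The code below is a hypothetical illustration, not the authors' implementation: it assumes each class-conditional feature distribution is a diagonal Gaussian and measures cluster mismatch with a KL divergence; the function names and the confidence threshold are made up for illustration.

```python
import numpy as np

def kl_diag_gauss(mu_p, var_p, mu_q, var_q):
    """KL( N(mu_p, diag(var_p)) || N(mu_q, diag(var_q)) ) for diagonal Gaussians."""
    return 0.5 * np.sum(np.log(var_q / var_p)
                        + (var_p + (mu_p - mu_q) ** 2) / var_q
                        - 1.0)

def anchored_clustering_loss(src_stats, tgt_stats):
    """Sum per-class KL divergences between matched target and source clusters.

    Each element of src_stats / tgt_stats is a (mean, variance) pair for one class.
    """
    return sum(kl_diag_gauss(tgt_mu, tgt_var, src_mu, src_var)
               for (src_mu, src_var), (tgt_mu, tgt_var) in zip(src_stats, tgt_stats))

def update_target_stats(feats, pseudo_labels, confidences, num_classes, thresh=0.9):
    """Re-estimate per-class target Gaussians from confident pseudo-labeled features."""
    stats = []
    for c in range(num_classes):
        mask = (pseudo_labels == c) & (confidences > thresh)
        f = feats[mask]
        if len(f) < 2:  # too few confident samples: signal the caller to keep old stats
            stats.append(None)
            continue
        stats.append((f.mean(axis=0), f.var(axis=0) + 1e-6))
    return stats
```

In a full sTTT loop, `update_target_stats` would be applied to each incoming test batch (typically as a running estimate), and the clustering loss would be combined with a pseudo-label self-training objective — the regularization that, per the abstract, mitigates ST's confirmation bias.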
Related papers
- BoostAdapter: Improving Vision-Language Test-Time Adaptation via Regional Bootstrapping [64.8477128397529]
We propose a training-required and training-free test-time adaptation framework.
We maintain a light-weight key-value memory for feature retrieval from instance-agnostic historical samples and instance-aware boosting samples.
We theoretically justify the rationality behind our method and empirically verify its effectiveness on both the out-of-distribution and the cross-domain datasets.
arXiv Detail & Related papers (2024-10-20T15:58:43Z)
- Enhancing Test Time Adaptation with Few-shot Guidance [35.13317598777832]
Deep neural networks often encounter significant performance drops when facing domain shifts between training (source) and test (target) data.
Test Time Adaptation (TTA) methods have been proposed to adapt a pre-trained source model to handle out-of-distribution streaming target data.
We develop Few-Shot Test Time Adaptation (FS-TTA), a novel and practical setting that utilizes a few-shot support set on top of TTA.
arXiv Detail & Related papers (2024-09-02T15:50:48Z)
- UniTTA: Unified Benchmark and Versatile Framework Towards Realistic Test-Time Adaptation [66.05528698010697]
Test-Time Adaptation aims to adapt pre-trained models to the target domain during testing.
Researchers have identified various challenging scenarios and developed diverse methods to address these challenges.
We propose a Unified Test-Time Adaptation benchmark, which is comprehensive and widely applicable.
arXiv Detail & Related papers (2024-07-29T15:04:53Z)
- Active Test-Time Adaptation: Theoretical Analyses and An Algorithm [51.84691955495693]
Test-time adaptation (TTA) addresses distribution shifts for streaming test data in unsupervised settings.
We propose the novel problem setting of active test-time adaptation (ATTA) that integrates active learning within the fully TTA setting.
arXiv Detail & Related papers (2024-04-07T22:31:34Z)
- pSTarC: Pseudo Source Guided Target Clustering for Fully Test-Time Adaptation [15.621092104244003]
Test Time Adaptation (TTA) is a pivotal concept in machine learning, enabling models to perform well in real-world scenarios.
We propose a novel approach called pseudo Source guided Target Clustering (pSTarC) addressing the relatively unexplored area of TTA under real-world domain shifts.
arXiv Detail & Related papers (2023-09-02T07:13:47Z)
- Improved Test-Time Adaptation for Domain Generalization [48.239665441875374]
Test-time training (TTT) adapts the learned model with test data.
This work addresses two main factors: selecting an appropriate auxiliary TTT task for updating and identifying reliable parameters to update during the test phase.
We introduce additional adaptive parameters for the trained model, and we suggest only updating the adaptive parameters during the test phase.
arXiv Detail & Related papers (2023-04-10T10:12:38Z)
- TeST: Test-time Self-Training under Distribution Shift [99.68465267994783]
Test-Time Self-Training (TeST) is a technique that takes as input a model trained on some source data and a novel data distribution at test time.
We find that models adapted using TeST significantly improve over baseline test-time adaptation algorithms.
arXiv Detail & Related papers (2022-09-23T07:47:33Z)
- Revisiting Realistic Test-Time Training: Sequential Inference and Adaptation by Anchored Clustering [37.76664203157892]
We develop a test-time anchored clustering (TTAC) approach to enable stronger test-time feature learning.
TTAC discovers clusters in both source and target domains and matches the target clusters to the source ones to improve generalization.
We demonstrate that under all TTT protocols TTAC consistently outperforms the state-of-the-art methods on five TTT datasets.
arXiv Detail & Related papers (2022-06-06T16:23:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.