PC-Adapter: Topology-Aware Adapter for Efficient Domain Adaption on Point Clouds with Rectified Pseudo-label
- URL: http://arxiv.org/abs/2309.16936v1
- Date: Fri, 29 Sep 2023 02:32:01 GMT
- Title: PC-Adapter: Topology-Aware Adapter for Efficient Domain Adaption on Point Clouds with Rectified Pseudo-label
- Authors: Joonhyung Park, Hyunjin Seo, Eunho Yang
- Abstract summary: We revisit the unique challenges of point cloud data under domain shift scenarios.
We propose an adapter-guided domain adaptation method, PC-Adapter.
Our method demonstrates superiority over baselines on various domain shift settings in benchmark datasets.
- Score: 39.6711648793659
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Understanding point clouds captured from the real world is challenging due to
shifts in data distribution caused by varying object scales, sensor angles, and
self-occlusion. Prior works have addressed this issue by combining recent
learning principles such as self-supervised learning, self-training, and
adversarial training, which leads to significant computational overhead. Toward
succinct yet powerful domain adaptation for point clouds, we revisit the unique
challenges of point cloud data under domain shift scenarios and discover the
importance of the global geometry of source data and trends of target
pseudo-labels biased to the source label distribution. Motivated by our
observations, we propose an adapter-guided domain adaptation method,
PC-Adapter, that preserves the global shape information of the source domain
using an attention-based adapter, while learning the local characteristics of
the target domain via another adapter equipped with graph convolution.
Additionally, we propose a novel pseudo-labeling strategy resilient to the
classifier bias by adjusting confidence scores using their class-wise
confidence distributions to consider relative confidences. Our method
demonstrates superiority over baselines on various domain shift settings in
benchmark datasets - PointDA, GraspNetPC, and PointSegDA.
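As a rough illustration of the rectified pseudo-labeling idea described in the abstract, the sketch below normalizes each target sample's confidence against the confidence distribution of its own predicted class before selecting pseudo-labels, so that classes the source-biased classifier is systematically over-confident about do not crowd out the rest. The per-class z-score normalization, the keep_ratio parameter, and the function name are illustrative assumptions, not PC-Adapter's exact formulation.

import torch

def rectified_pseudo_labels(logits: torch.Tensor, keep_ratio: float = 0.5):
    # logits: (N, C) classifier outputs on unlabeled target samples.
    probs = torch.softmax(logits, dim=1)
    conf, pred = probs.max(dim=1)                # raw confidence and hard pseudo-label
    rel_conf = torch.zeros_like(conf)
    for c in pred.unique():
        mask = pred == c
        mu = conf[mask].mean()
        sigma = conf[mask].std(unbiased=False).clamp_min(1e-6)
        # relative confidence: how confident a sample is within its predicted class
        rel_conf[mask] = (conf[mask] - mu) / sigma
    # keep the top fraction of samples by class-normalized confidence
    k = max(1, int(keep_ratio * conf.numel()))
    idx = rel_conf.topk(k).indices
    return idx, pred[idx]

In a self-training loop, the selected samples and their pseudo-labels would supervise the target-side adapter; the selection ratio is an assumed hyperparameter.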
Related papers
- TransAdapter: Vision Transformer for Feature-Centric Unsupervised Domain Adaptation [0.3277163122167433]
Unsupervised Domain Adaptation (UDA) aims to utilize labeled data from a source domain to solve tasks in an unlabeled target domain.
Traditional CNN-based methods struggle to fully capture complex domain relationships.
We propose a novel UDA approach leveraging the Swin Transformer with three key modules.
arXiv Detail & Related papers (2024-12-05T11:11:39Z)
Progressive Conservative Adaptation for Evolving Target Domains [76.9274842289221]
Conventional domain adaptation typically transfers knowledge from a source domain to a stationary target domain.
Target data can instead emerge sequentially with continuously evolving distributions; restoring and adapting to such target data results in escalating computational and resource consumption over time.
We propose a simple yet effective approach, termed progressive conservative adaptation (PCAda).
arXiv Detail & Related papers (2024-02-07T04:11:25Z)
Unsupervised Domain Adaptation via Distilled Discriminative Clustering [45.39542287480395]
We re-cast the domain adaptation problem as discriminative clustering of target data.
We propose to jointly train the network using parallel, supervised learning objectives over labeled source data.
We conduct careful ablation studies and extensive experiments on five popular benchmark datasets.
arXiv Detail & Related papers (2023-02-23T13:03:48Z)
Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
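For reference, the Maximum Mean Discrepancy term mentioned above can be estimated with a plain RBF kernel between two feature batches, as in the sketch below. This is a generic, biased MMD estimator and omits DaC's memory bank, so treat it as an assumed stand-in rather than that paper's implementation.

import torch

def rbf_mmd(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    # x: (n, d) source-like features, y: (m, d) target-specific features.
    def kernel(a, b):
        d2 = torch.cdist(a, b).pow(2)            # pairwise squared Euclidean distances
        return torch.exp(-d2 / (2.0 * sigma ** 2))
    # biased estimate of MMD^2 between the two feature distributions
    return kernel(x, x).mean() + kernel(y, y).mean() - 2.0 * kernel(x, y).mean()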
arXiv Detail & Related papers (2022-11-12T09:21:49Z)
Instance Relation Graph Guided Source-Free Domain Adaptive Object Detection [79.89082006155135]
Unsupervised Domain Adaptation (UDA) is an effective approach to tackle the issue of domain shift.
UDA methods try to align the source and target representations to improve the generalization on the target domain.
The Source-Free Domain Adaptation (SFDA) setting aims to alleviate these concerns by adapting a source-trained model to the target domain without requiring access to the source data.
arXiv Detail & Related papers (2022-03-29T17:50:43Z)
Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
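The "multi-sample contrastive loss" mentioned above is not spelled out here; the sketch below shows a generic multi-positive contrastive objective over source and target embeddings grouped by labels or pseudo-labels. It conveys the general idea of pulling same-class samples together across domains, but is only an assumed stand-in for ILA-DA's affinity-based pairing.

import torch
import torch.nn.functional as F

def multi_positive_contrastive(features: torch.Tensor, labels: torch.Tensor,
                               temperature: float = 0.1) -> torch.Tensor:
    # features: (N, d) embeddings pooled from source and target batches;
    # labels: (N,) ground-truth labels (source) or pseudo-labels (target).
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / temperature                          # (N, N) cosine similarities
    n = sim.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=sim.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    logits = sim.masked_fill(self_mask, float('-inf'))     # never contrast with self
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp_min(1)
    # average log-probability over all positives of each anchor
    loss = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(dim=1) / pos_counts
    return loss.mean()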
arXiv Detail & Related papers (2021-04-03T01:33:14Z)
Supervised Domain Adaptation using Graph Embedding [86.3361797111839]
Domain adaptation methods assume that the distributions of the two domains are shifted and attempt to realign them.
We propose a generic framework based on graph embedding.
We show that the proposed approach leads to a powerful Domain Adaptation framework.
arXiv Detail & Related papers (2020-03-09T12:25:13Z)