Noise2Score3D: Unsupervised Tweedie's Approach for Point Cloud Denoising
- URL: http://arxiv.org/abs/2502.16826v2
- Date: Mon, 03 Mar 2025 03:09:49 GMT
- Title: Noise2Score3D: Unsupervised Tweedie's Approach for Point Cloud Denoising
- Authors: Xiangbin Wei
- Abstract summary: Noise2Score3D learns the gradient of the underlying point cloud distribution directly from noisy data. Our method performs inference in a single step, avoiding the iterative processes used in existing unsupervised methods. We introduce Total Variation for Point Cloud, a criterion that allows for the estimation of unknown noise parameters.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Building on recent advances in Bayesian statistics and image denoising, we propose Noise2Score3D, a fully unsupervised framework for point cloud denoising that addresses the critical challenge of limited availability of clean data. Noise2Score3D learns the gradient of the underlying point cloud distribution directly from noisy data, eliminating the need for clean data during training. By leveraging Tweedie's formula, our method performs inference in a single step, avoiding the iterative processes used in existing unsupervised methods, thereby improving both performance and efficiency. Experimental results demonstrate that Noise2Score3D achieves state-of-the-art performance on standard benchmarks, outperforming other unsupervised methods in Chamfer distance and point-to-mesh metrics, and rivaling some supervised approaches. Furthermore, Noise2Score3D demonstrates strong generalization ability beyond training datasets. Additionally, we introduce Total Variation for Point Cloud, a criterion that allows for the estimation of unknown noise parameters, which further enhances the method's versatility and real-world utility.
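The single-step inference described above rests on Tweedie's formula, which turns a learned score (the gradient of the log-density of the noisy data) into a posterior-mean denoising estimate: x̂ = y + σ²∇log p(y). The sketch below is a hypothetical toy illustration, not the paper's implementation: it substitutes the analytic score of a known Gaussian marginal for the learned score network, and adds a minimal Chamfer distance helper of the kind used in the reported metrics. All names here are illustrative assumptions.

```python
import numpy as np

def tweedie_denoise(noisy_points, score_fn, sigma):
    """Posterior-mean estimate via Tweedie's formula:
    x_hat = y + sigma^2 * grad log p(y)."""
    return noisy_points + sigma**2 * score_fn(noisy_points)

def chamfer(a, b):
    """Symmetric Chamfer distance between two point sets (N, 3) and (M, 3)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean() + d.min(axis=0).mean()

# Toy setup: clean points all at the origin, noisy observations
# y ~ N(0, sigma^2 I). The marginal of y is then N(0, sigma^2 I),
# whose score is -y / sigma^2, so Tweedie's formula recovers the
# clean points exactly in a single step.
rng = np.random.default_rng(0)
sigma = 0.1
clean = np.zeros((100, 3))
noisy = clean + sigma * rng.standard_normal((100, 3))

score = lambda y: -y / sigma**2   # analytic score of the noisy marginal
denoised = tweedie_denoise(noisy, score, sigma)

print(chamfer(noisy, clean))      # positive: noise dominates
print(chamfer(denoised, clean))   # 0.0: points pulled back to the origin
```

In the actual method the analytic `score` above is replaced by a network trained on noisy data alone; the denoising step itself stays this simple, which is why inference needs no iteration.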
Related papers
- Noise2Score3D: Tweedie's Approach for Unsupervised Point Cloud Denoising [0.0]
Noise2Score3D learns the score function of the underlying point cloud distribution directly from noisy data.
Our method performs denoising in a single step, avoiding the iterative processes used in existing unsupervised methods.
arXiv Detail & Related papers (2025-03-12T11:28:04Z)
- SoftPatch: Unsupervised Anomaly Detection with Noisy Data [67.38948127630644]
This paper considers label-level noise in image sensory anomaly detection for the first time.
We propose a memory-based unsupervised AD method, SoftPatch, which efficiently denoises the data at the patch level.
Compared with existing methods, SoftPatch maintains a strong modeling ability of normal data and alleviates the overconfidence problem in coreset.
arXiv Detail & Related papers (2024-03-21T08:49:34Z)
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Fine-tuning Pre-trained Models for Robustness Under Noisy Labels [34.68018860186995]
The presence of noisy labels in a training dataset can significantly impact the performance of machine learning models.
We introduce a novel algorithm called TURN, which robustly and efficiently transfers the prior knowledge of pre-trained models.
arXiv Detail & Related papers (2023-10-24T20:28:59Z)
- Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks [91.15120211190519]
This paper aims to understand the nature of noise in pre-training datasets and to mitigate its impact on downstream tasks.
We propose a light-weight black-box tuning method (NMTune) to affine the feature space to mitigate the malignant effect of noise.
arXiv Detail & Related papers (2023-09-29T06:18:15Z)
- Label Noise: Correcting the Forward-Correction [0.0]
Training neural network classifiers on datasets with label noise poses a risk of overfitting them to the noisy labels.
We propose an approach to tackling the overfitting caused by label noise.
Specifically, we impose a lower bound on the training loss to mitigate overfitting.
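A well-known way to impose such a lower bound is the "flooding" transform (Ishida et al., 2020), which the sketch below assumes; the entry's actual formulation may differ, and the flood level `b` is a hypothetical hyperparameter.

```python
def flood(loss, b=0.1):
    """Flooding transform: |loss - b| + b.
    Above the flood level b the loss passes through unchanged; below it,
    the gradient's sign flips, so training cannot drive the loss to zero
    by memorizing noisy labels."""
    return abs(loss - b) + b

print(flood(0.30))  # ~0.30: unchanged above the floor
print(flood(0.05))  # ~0.15: pushed back up below the floor
```

Applied to a mini-batch loss before backpropagation, this keeps the average training loss hovering around `b` instead of collapsing onto the noisy labels.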
arXiv Detail & Related papers (2023-07-24T19:41:19Z)
- Latent Class-Conditional Noise Model [54.56899309997246]
We introduce a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.
We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels.
Our approach safeguards the stable update of the noise transition, which avoids previous arbitrarily tuning from a mini-batch of samples.
arXiv Detail & Related papers (2023-02-19T15:24:37Z)
- Confidence-based Reliable Learning under Dual Noises [46.45663546457154]
Deep neural networks (DNNs) have achieved remarkable success in a variety of computer vision tasks.
Yet, the data collected from the open world are unavoidably polluted by noise, which may significantly undermine the efficacy of the learned models.
Various attempts have been made to reliably train DNNs under data noise, but they separately account for either the noise existing in the labels or that existing in the images.
This work provides a first, unified framework for reliable learning under the joint (image, label)-noise.
arXiv Detail & Related papers (2023-02-10T07:50:34Z)
- Identifying Hard Noise in Long-Tailed Sample Distribution [76.16113794808001]
We introduce Noisy Long-Tailed Classification (NLT).
Most de-noising methods fail to identify the hard noises.
We design an iterative noisy learning framework called Hard-to-Easy (H2E).
arXiv Detail & Related papers (2022-07-27T09:03:03Z)
- PD-Flow: A Point Cloud Denoising Framework with Normalizing Flows [20.382995180671205]
Point cloud denoising aims to restore clean point clouds from raw observations corrupted by noise and outliers.
We present a novel deep learning-based denoising model that incorporates normalizing flows and noise disentanglement techniques.
arXiv Detail & Related papers (2022-03-11T14:17:58Z)
- Differentiable Manifold Reconstruction for Point Cloud Denoising [23.33652755967715]
3D point clouds are often perturbed by noise due to the inherent limitations of acquisition equipment.
We propose to learn the underlying manifold of a noisy point cloud from differentiably subsampled points.
We show that our method significantly outperforms state-of-the-art denoising methods under both synthetic noise and real-world noise.
arXiv Detail & Related papers (2020-07-27T13:31:41Z)
- Non-Local Part-Aware Point Cloud Denoising [55.50360085086123]
This paper presents a novel non-local part-aware deep neural network to denoise point clouds.
We design the non-local learning unit (NLU) customized with a graph attention module to adaptively capture non-local semantically-related features.
To enhance the denoising performance, we cascade a series of NLUs to progressively distill the noise features from the noisy inputs.
arXiv Detail & Related papers (2020-03-14T13:51:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.