IterativePFN: True Iterative Point Cloud Filtering
- URL: http://arxiv.org/abs/2304.01529v1
- Date: Tue, 4 Apr 2023 04:47:44 GMT
- Title: IterativePFN: True Iterative Point Cloud Filtering
- Authors: Dasith de Silva Edirimuni, Xuequan Lu, Zhiwen Shao, Gang Li, Antonio
Robles-Kelly and Ying He
- Abstract summary: A fundamental 3D vision task is the removal of noise, known as point cloud filtering or denoising.
We propose IterativePFN (iterative point cloud filtering network), which consists of multiple IterationModules that model the true iterative filtering process internally.
Our method is able to obtain better performance compared to state-of-the-art methods.
- Score: 18.51768749680731
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The quality of point clouds is often limited by noise introduced during their
capture process. Consequently, a fundamental 3D vision task is the removal of
noise, known as point cloud filtering or denoising. State-of-the-art learning
based methods focus on training neural networks to infer filtered displacements
and directly shift noisy points onto the underlying clean surfaces. In high
noise conditions, they iterate the filtering process. However, this iterative
filtering is only done at test time and is less effective at ensuring points
converge quickly onto the clean surfaces. We propose IterativePFN (iterative
point cloud filtering network), which consists of multiple IterationModules
that model the true iterative filtering process internally, within a single
network. We train our IterativePFN network using a novel loss function that
utilizes an adaptive ground truth target at each iteration to capture the
relationship between intermediate filtering results during training. This
ensures that the filtered results converge faster to the clean surfaces. Our
method is able to obtain better performance compared to state-of-the-art
methods. The source code can be found at:
https://github.com/ddsediri/IterativePFN.
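The abstract's core idea — several filtering iterations inside a single network, each supervised against an adaptive ground-truth target that converges to the clean surface — can be illustrated with a toy sketch. Everything below is a hypothetical stand-in (a hand-written displacement rule instead of a learned IterationModule, 1-D points instead of 3D patches); it is not the paper's actual architecture or loss.

```python
import numpy as np

rng = np.random.default_rng(0)
clean = np.zeros(100)                      # points lying on the "clean surface" y = 0
noisy = clean + rng.normal(0.0, 0.1, 100)  # noise introduced at capture time

def iteration_module(points, step=0.5):
    """Toy stand-in for one learned IterationModule: predict a displacement
    that moves each point part of the way toward the clean surface."""
    return points + step * (clean - points)

def adaptive_targets(noisy, clean, num_iters):
    """Per-iteration targets that converge to the clean surface, so each
    IterationModule is supervised against a progressively less noisy target."""
    return [clean + (noisy - clean) * (1 - (t + 1) / num_iters)
            for t in range(num_iters)]

# "True iterative" filtering: the iterations live inside one forward pass,
# and every intermediate result is supervised during training.
x = noisy
targets = adaptive_targets(noisy, clean, num_iters=4)
for t in range(4):
    x = iteration_module(x)
    loss_t = np.mean((x - targets[t]) ** 2)  # per-iteration training loss

assert np.abs(x).mean() < np.abs(noisy).mean()  # points converged toward the surface
```

The contrast with test-time-only iteration is that here each internal step has its own target, so intermediate results are pushed toward the surface during training rather than only at inference.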
Related papers
- StraightPCF: Straight Point Cloud Filtering [50.66412286723848]
Point cloud filtering is a fundamental 3D vision task, which aims to remove noise while recovering the underlying clean surfaces.
We introduce StraightPCF, a new deep learning based method for point cloud filtering.
It works by moving noisy points along straight paths, thus reducing discretization errors while ensuring faster convergence to the clean surfaces.
arXiv Detail & Related papers (2024-05-14T05:41:59Z)
- Efficient CNNs via Passive Filter Pruning [23.661189257759535]
Convolutional neural networks (CNNs) have shown state-of-the-art performance in various applications.
However, CNNs are resource-hungry due to their high computational complexity and memory requirements.
Recent efforts toward achieving computational efficiency in CNNs involve filter pruning methods.
arXiv Detail & Related papers (2023-04-05T09:19:19Z)
- Contrastive Learning for Joint Normal Estimation and Point Cloud Filtering [12.602645108896636]
We propose a novel deep learning method to jointly estimate normals and filter point clouds.
We first introduce a 3D patch based contrastive learning framework, with noise corruption as an augmentation.
Experimental results show that our method well supports the two tasks simultaneously and preserves sharp features and fine details.
arXiv Detail & Related papers (2022-08-14T09:16:25Z)
- Batch Normalization Tells You Which Filter is Important [49.903610684578716]
We propose a simple yet effective filter pruning method by evaluating the importance of each filter based on the BN parameters of pre-trained CNNs.
The experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method can achieve outstanding performance.
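The general mechanism behind BN-based filter importance can be sketched briefly. A BatchNorm layer scales each filter's output by a learned factor gamma, so filters whose |gamma| is near zero contribute little and are natural pruning candidates. The sketch below uses made-up gamma values and a simple magnitude ranking; the paper's exact criterion is not reproduced here.

```python
import numpy as np

# Hypothetical BN scale factors (gamma) for 5 filters in a pre-trained layer.
gamma = np.array([0.90, 0.02, 0.45, 0.01, 0.63])

importance = np.abs(gamma)          # small |gamma| => weak contribution
prune_ratio = 0.4
k = int(len(gamma) * prune_ratio)   # number of filters to remove

pruned = np.argsort(importance)[:k]  # indices of the least important filters
kept = np.argsort(importance)[k:]

print(sorted(pruned.tolist()))  # [1, 3] — the filters with the smallest |gamma|
```

Because gamma is already learned during ordinary training, this kind of criterion needs no extra data passes to score filters.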
arXiv Detail & Related papers (2021-12-02T12:04:59Z)
- Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient.
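Classic unsharp masking, which inspired that formulation, subtracts a low-pass (blurred) version of a signal to isolate structure and adds it back scaled by a single coefficient. The 1-D sketch below shows only the classic operation; the paper's guided-filter variant, which estimates the coefficient from a guidance image, is not reproduced here.

```python
import numpy as np

def box_blur(x, radius=2):
    """Simple 1-D box filter as the low-pass component."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(x, kernel, mode="same")

def unsharp_mask(x, coeff=1.5, radius=2):
    low = box_blur(x, radius)
    detail = x - low               # high-frequency structure
    return low + coeff * detail    # coeff > 1 amplifies structure

signal = np.concatenate([np.zeros(10), np.ones(10)])  # a step edge
sharpened = unsharp_mask(signal)
# the edge transition steepens, with overshoot on either side of the step
```

The single coefficient (`coeff` above) is what makes the structure transfer explicit: it directly controls how much of the high-frequency detail is injected back.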
arXiv Detail & Related papers (2021-06-02T19:15:34Z)
- Training Interpretable Convolutional Neural Networks by Differentiating Class-specific Filters [64.46270549587004]
Convolutional neural networks (CNNs) have been successfully used in a range of tasks.
However, CNNs are often viewed as "black boxes" and lack interpretability.
We propose a novel strategy to train interpretable CNNs by encouraging class-specific filters.
arXiv Detail & Related papers (2020-07-16T09:12:26Z)
- Filter Grafting for Deep Neural Networks: Reason, Method, and Cultivation [86.91324735966766]
Filters are the key components in modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting (Method) to achieve this goal.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z)
- Convolutional Neural Network Pruning Using Filter Attenuation [10.282782377635106]
Filters are essential elements in convolutional neural networks (CNNs).
In filter pruning methods, a filter with all of its components, including channels and connections, is removed.
We propose a CNN pruning method based on filter attenuation in which weak filters are not directly removed.
arXiv Detail & Related papers (2020-02-09T06:31:24Z)
- Filter Grafting for Deep Neural Networks [71.39169475500324]
Filter grafting aims to improve the representation capability of Deep Neural Networks (DNNs).
We develop an entropy-based criterion to measure the information of filters and an adaptive weighting strategy for balancing the grafted information among networks.
For example, the grafted MobileNetV2 outperforms the non-grafted MobileNetV2 by about 7 percent on CIFAR-100 dataset.
arXiv Detail & Related papers (2020-01-15T03:18:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.