AU-PD: An Arbitrary-size and Uniform Downsampling Framework for Point
Clouds
- URL: http://arxiv.org/abs/2211.01110v1
- Date: Wed, 2 Nov 2022 13:37:16 GMT
- Title: AU-PD: An Arbitrary-size and Uniform Downsampling Framework for Point
Clouds
- Authors: Peng Zhang, Ruoyin Xie, Jinsheng Sun, Weiqing Li, and Zhiyong Su
- Abstract summary: We introduce AU-PD, a novel task-aware sampling framework that directly downsamples a point cloud to any smaller size.
We refine the pre-sampled set to make it task-aware, driven by downstream task losses.
With an attention mechanism and a proper training scheme, the framework learns to adaptively refine pre-sampled sets of different sizes.
- Score: 6.786701761788659
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Point cloud downsampling is a crucial pre-processing operation that reduces
the number of points in a point cloud in order to lower computational cost and
communication load, among other benefits. Recent research on point cloud downsampling,
which concentrates on learning to sample in a task-aware way, has achieved great
success. However, existing learnable samplers cannot perform arbitrary-size sampling
directly. Moreover, their sampled results always contain many overlapping points. In
this paper, we introduce AU-PD, a novel task-aware sampling framework that directly
downsamples a point cloud to any smaller size based on a sample-to-refine strategy.
Given a specified arbitrary size, we first perform task-agnostic pre-sampling of the
input point cloud. Then, we refine the pre-sampled set to make it task-aware, driven
by downstream task losses. The refinement is realized by adding to each pre-sampled
point a small offset predicted by point-wise multi-layer perceptrons (MLPs). In this
way, the sampled set remains almost unchanged from the original in distribution, and
therefore contains fewer overlapping points. With the attention mechanism and a proper
training scheme, the framework learns to adaptively refine pre-sampled sets of
different sizes. We evaluate the sampled results on classification and registration
tasks. The proposed AU-PD achieves downstream performance competitive with the
state-of-the-art method while being more flexible and producing fewer overlapping
points in the sampled set. The source code will be publicly available at
https://zhiyongsu.github.io/Project/AUPD.html.
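The abstract describes the sample-to-refine pipeline only at a high level. The sketch below illustrates the idea in PyTorch under stated assumptions: farthest point sampling stands in for the task-agnostic pre-sampler, and a shared point-wise MLP with a single self-attention layer predicts the per-point offsets. All names (farthest_point_sampling, OffsetRefiner, d_model) and hyper-parameters are hypothetical and not taken from the authors' released code; during training, the refined set would be fed to the downstream classification or registration network and optimized against that task's loss.

```python
# Minimal sketch of the sample-to-refine idea (not the authors' implementation).
# Assumptions: FPS as the task-agnostic pre-sampler, a shared point-wise MLP plus
# self-attention predicting per-point offsets, downstream task loss supplied elsewhere.
import torch
import torch.nn as nn


def farthest_point_sampling(points: torch.Tensor, m: int) -> torch.Tensor:
    """Task-agnostic pre-sampling: pick m indices from a (B, N, 3) cloud via FPS."""
    B, N, _ = points.shape
    idx = torch.zeros(B, m, dtype=torch.long, device=points.device)
    dist = torch.full((B, N), float("inf"), device=points.device)
    farthest = torch.zeros(B, dtype=torch.long, device=points.device)
    batch = torch.arange(B, device=points.device)
    for i in range(m):
        idx[:, i] = farthest
        centroid = points[batch, farthest].unsqueeze(1)            # (B, 1, 3)
        dist = torch.minimum(dist, ((points - centroid) ** 2).sum(-1))
        farthest = dist.argmax(dim=1)
    return idx


class OffsetRefiner(nn.Module):
    """Refine a pre-sampled set by adding small MLP-predicted offsets."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Sequential(nn.Linear(3, d_model), nn.ReLU(),
                                   nn.Linear(d_model, d_model))
        # Self-attention lets the refiner adapt to sampled sets of any size m.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.offset_head = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                         nn.Linear(d_model, 3))

    def forward(self, cloud: torch.Tensor, m: int) -> torch.Tensor:
        # 1) Task-agnostic pre-sampling to the requested arbitrary size m.
        idx = farthest_point_sampling(cloud, m)
        pre = torch.gather(cloud, 1, idx.unsqueeze(-1).expand(-1, -1, 3))
        # 2) Task-aware refinement: point-wise features + attention -> offsets.
        feat = self.embed(pre)
        feat, _ = self.attn(feat, feat, feat)
        offsets = self.offset_head(feat)
        # Small offsets keep the refined set close to the input distribution,
        # which is why few points end up overlapping.
        return pre + offsets


if __name__ == "__main__":
    cloud = torch.rand(2, 1024, 3)               # toy batch of point clouds
    sampler = OffsetRefiner()
    m = 256
    refined = sampler(cloud, m=m)                # any target size works
    print(refined.shape)                         # torch.Size([2, 256, 3])

    # Rough "overlap" check: count sampled points whose nearest neighbour
    # within the sampled set lies closer than a small threshold.
    d = torch.cdist(refined, refined)
    d += torch.eye(m).unsqueeze(0) * 1e6         # ignore self-distances
    print((d.min(dim=-1).values < 1e-2).sum(dim=-1))
```

The last few lines show one plausible way to quantify "overlapping points": counting sampled points whose nearest neighbour in the sampled set lies closer than a small threshold; the paper's exact overlap measure may differ.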
Related papers
- Learning Continuous Implicit Field with Local Distance Indicator for Arbitrary-Scale Point Cloud Upsampling [55.05706827963042]
Point cloud upsampling aims to generate dense and uniformly distributed point sets from a sparse point cloud.
Previous methods typically split a sparse point cloud into several local patches, upsample patch points, and merge all upsampled patches.
We propose a novel approach that learns an unsigned distance field guided by local priors for point cloud upsampling.
arXiv Detail & Related papers (2023-12-23T01:52:14Z)
- Grad-PU: Arbitrary-Scale Point Cloud Upsampling via Gradient Descent with Learned Distance Functions [77.32043242988738]
We propose a new framework for accurate point cloud upsampling that supports arbitrary upsampling rates.
Our method first interpolates the low-res point cloud according to a given upsampling rate.
arXiv Detail & Related papers (2023-04-24T06:36:35Z)
- APSNet: Attention Based Point Cloud Sampling [0.7734726150561088]
We develop an attention-based point cloud sampling network (APSNet) to tackle this problem.
Both supervised learning and knowledge distillation-based self-supervised learning of APSNet are proposed.
Experiments demonstrate the superior performance of APSNet against state-of-the-art methods in various downstream tasks.
arXiv Detail & Related papers (2022-10-11T17:30:46Z)
- Arbitrary Point Cloud Upsampling with Spherical Mixture of Gaussians [1.2375561840897737]
APU-SMOG is a Transformer-based model for Arbitrary Point cloud Upsampling (APU).
APU-SMOG outperforms state-of-the-art fixed-ratio methods.
arXiv Detail & Related papers (2022-08-10T11:10:16Z)
- BIMS-PU: Bi-Directional and Multi-Scale Point Cloud Upsampling [60.257912103351394]
We develop a new point cloud upsampling pipeline called BIMS-PU.
We decompose the up/downsampling procedure into several up/downsampling sub-steps by breaking the target sampling factor into smaller factors.
We show that our method achieves superior results to state-of-the-art approaches.
arXiv Detail & Related papers (2022-06-25T13:13:37Z)
- Self-Supervised Arbitrary-Scale Point Clouds Upsampling via Implicit Neural Representation [79.60988242843437]
We propose a novel approach that achieves self-supervised and magnification-flexible point cloud upsampling simultaneously.
Experimental results demonstrate that our self-supervised scheme achieves competitive or even better performance than supervised state-of-the-art methods.
arXiv Detail & Related papers (2022-04-18T07:18:25Z)
- Meta-Sampler: Almost-Universal yet Task-Oriented Sampling for Point Clouds [46.33828400918886]
We show how we can train an almost-universal meta-sampler across multiple tasks.
This meta-sampler can then be rapidly fine-tuned when applied to different datasets, networks, or even different tasks.
arXiv Detail & Related papers (2022-03-30T02:21:34Z)
- Beyond Farthest Point Sampling in Point-Wise Analysis [52.218037492342546]
We propose a novel data-driven sampler learning strategy for point-wise analysis tasks.
We learn sampling and downstream applications jointly.
Our experiments show that jointly learning the sampler and the task brings remarkable improvement over previous baseline methods.
arXiv Detail & Related papers (2021-07-09T08:08:44Z)
- PointLIE: Locally Invertible Embedding for Point Cloud Sampling and Recovery [35.353458457283544]
Point Cloud Sampling and Recovery (PCSR) is critical for massive real-time point cloud collection and processing.
We propose a novel Locally Invertible Embedding for point cloud adaptive sampling and recovery (PointLIE).
PointLIE unifies point cloud sampling and upsampling in a single framework through bi-directional learning.
arXiv Detail & Related papers (2021-04-30T05:55:59Z)
- Learning a Unified Sample Weighting Network for Object Detection [113.98404690619982]
Region sampling or weighting is significantly important to the success of modern region-based object detectors.
We argue that sample weighting should be data-dependent and task-dependent.
We propose a unified sample weighting network to predict a sample's task weights.
arXiv Detail & Related papers (2020-06-11T16:19:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.