Self-Supervised Arbitrary-Scale Point Clouds Upsampling via Implicit
Neural Representation
- URL: http://arxiv.org/abs/2204.08196v1
- Date: Mon, 18 Apr 2022 07:18:25 GMT
- Title: Self-Supervised Arbitrary-Scale Point Clouds Upsampling via Implicit
Neural Representation
- Authors: Wenbo Zhao, Xianming Liu, Zhiwei Zhong, Junjun Jiang, Wei Gao, Ge Li,
Xiangyang Ji
- Abstract summary: We propose a novel approach that achieves self-supervised and magnification-flexible point clouds upsampling simultaneously.
Experimental results demonstrate that our self-supervised learning-based scheme achieves competitive or even better performance than supervised learning-based state-of-the-art methods.
- Score: 79.60988242843437
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Point clouds upsampling, the task of generating dense and uniform
point clouds from a given sparse input, is challenging. Most existing methods
either adopt an end-to-end supervised learning paradigm, which requires large
numbers of paired sparse inputs and dense ground truths as supervision; or
treat upsampling at different scale factors as independent tasks, requiring a
separate network for each factor. In this
paper, we propose a novel approach that achieves self-supervised and
magnification-flexible point clouds upsampling simultaneously. We formulate
point clouds upsampling as the task of seeking nearest projection points on the
implicit surface for seed points. To this end, we define two implicit neural
functions to estimate projection direction and distance respectively, which can
be trained by two pretext learning tasks. Experimental results demonstrate that
our self-supervised learning-based scheme achieves competitive or even better
performance than supervised learning-based state-of-the-art methods. The source
code is publicly available at https://github.com/xnowbzhao/sapcu.
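The projection formulation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the two implicit neural functions for direction and distance are replaced here by analytic stand-ins for a unit sphere, so the decomposition (seed point plus predicted distance times predicted direction) can be run end to end.

```python
import numpy as np

def f_direction(seeds):
    """Stand-in for the direction network: unit vector from each seed
    toward the surface. For a unit sphere centred at the origin this is
    the outward radial direction."""
    norms = np.linalg.norm(seeds, axis=1, keepdims=True)
    return seeds / norms

def f_distance(seeds):
    """Stand-in for the distance network: signed distance from each seed
    to the unit sphere along the predicted direction."""
    return 1.0 - np.linalg.norm(seeds, axis=1, keepdims=True)

def upsample(sparse_points, scale=4, sigma=0.05, rng=None):
    """Generate seed points by jittering the sparse input, then project
    each seed onto the implicit surface: p = q + t(q) * d(q)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    seeds = np.repeat(sparse_points, scale, axis=0)
    seeds = seeds + rng.normal(0.0, sigma, seeds.shape)
    return seeds + f_distance(seeds) * f_direction(seeds)

# Usage: sparse points on a unit sphere, upsampled 4x back onto it.
sparse = np.random.default_rng(1).normal(size=(64, 3))
sparse /= np.linalg.norm(sparse, axis=1, keepdims=True)
dense = upsample(sparse, scale=4)
```

In the paper, `f_direction` and `f_distance` would be the two trained implicit neural functions; the stand-ins above only demonstrate how their outputs combine to place upsampled points on the surface, independent of the scale factor.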
Related papers
- Learning Continuous Implicit Field with Local Distance Indicator for
Arbitrary-Scale Point Cloud Upsampling [55.05706827963042]
Point cloud upsampling aims to generate dense and uniformly distributed point sets from a sparse point cloud.
Previous methods typically split a sparse point cloud into several local patches, upsample patch points, and merge all upsampled patches.
We propose a novel approach that learns an unsigned distance field guided by local priors for point cloud upsampling.
arXiv Detail & Related papers (2023-12-23T01:52:14Z)
- Grad-PU: Arbitrary-Scale Point Cloud Upsampling via Gradient Descent with
Learned Distance Functions [77.32043242988738]
We propose a new framework for accurate point cloud upsampling that supports arbitrary upsampling rates.
Our method first interpolates the low-res point cloud according to a given upsampling rate.
arXiv Detail & Related papers (2023-04-24T06:36:35Z)
- Parametric Surface Constrained Upsampler Network for Point Cloud [33.033469444588086]
We introduce a novel surface regularizer into the upsampler network by forcing the neural network to learn the underlying parametric surface represented by bicubic functions and rotation functions.
These designs are integrated into two different networks for two tasks that take advantage of upsampling layers.
State-of-the-art experimental results on both tasks demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2023-03-14T21:12:54Z)
- BIMS-PU: Bi-Directional and Multi-Scale Point Cloud Upsampling [60.257912103351394]
We develop a new point cloud upsampling pipeline called BIMS-PU.
We decompose the up/downsampling procedure into several up/downsampling sub-steps by breaking the target sampling factor into smaller factors.
We show that our method achieves superior results to state-of-the-art approaches.
arXiv Detail & Related papers (2022-06-25T13:13:37Z)
- SPU-Net: Self-Supervised Point Cloud Upsampling by Coarse-to-Fine
Reconstruction with Self-Projection Optimization [52.20602782690776]
It is expensive and tedious to obtain large-scale paired sparse-dense point sets for training from real scanned sparse data.
We propose a self-supervised point cloud upsampling network, named SPU-Net, to capture the inherent upsampling patterns of points lying on the underlying object surface.
We conduct various experiments on both synthetic and real-scanned datasets, and the results demonstrate that we achieve comparable performance to the state-of-the-art supervised methods.
arXiv Detail & Related papers (2020-12-08T14:14:09Z)
- Self-Sampling for Neural Point Cloud Consolidation [83.31236364265403]
We introduce a novel technique for neural point cloud consolidation which learns from only the input point cloud.
We repeatedly self-sample the input point cloud with global subsets that are used to train a deep neural network.
We demonstrate the ability to consolidate point sets from a variety of shapes, while eliminating outliers and noise.
arXiv Detail & Related papers (2020-08-14T17:16:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.