PCT: Point Cloud Transformer
- URL: http://arxiv.org/abs/2012.09688v3
- Date: Sat, 27 Mar 2021 04:36:37 GMT
- Title: PCT: Point Cloud Transformer
- Authors: Meng-Hao Guo, Jun-Xiong Cai, Zheng-Ning Liu, Tai-Jiang Mu, Ralph R.
Martin and Shi-Min Hu
- Abstract summary: This paper presents a novel framework named Point Cloud Transformer for point cloud learning.
PCT is based on the Transformer architecture, which has achieved great success in natural language processing.
The Transformer is inherently permutation invariant when processing a sequence of points, making it well suited to point cloud learning.
- Score: 35.34343810480954
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The irregular domain and lack of ordering make it challenging to design deep
neural networks for point cloud processing. This paper presents a novel
framework named Point Cloud Transformer (PCT) for point cloud learning. PCT is
based on the Transformer, which has achieved huge success in natural language
processing and displays great potential in image processing. It is inherently
permutation invariant for processing a sequence of points, making it
well-suited for point cloud learning. To better capture local context within
the point cloud, we enhance input embedding with the support of farthest point
sampling and nearest neighbor search. Extensive experiments demonstrate that
PCT achieves state-of-the-art performance on shape classification, part
segmentation and normal estimation tasks.
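The abstract's two key ingredients can be illustrated in a few lines. The sketch below (plain NumPy, not the paper's code) builds a neighbor embedding from farthest point sampling and k-nearest-neighbor search, runs vanilla softmax self-attention over the sampled points, and checks that a max-pooled global descriptor is unchanged when the input points are shuffled. The toy relative-coordinate embedding, identity attention projections, and the max-norm FPS seed are stand-ins; PCT's offset-attention and learned layers are not reproduced.

```python
import numpy as np

def farthest_point_sampling(pts, m):
    """Greedy FPS: repeatedly pick the point farthest from those already
    chosen. Seeding with the max-norm point keeps the result independent
    of input ordering (up to exact ties)."""
    idx = np.empty(m, dtype=int)
    idx[0] = int(np.argmax(np.linalg.norm(pts, axis=1)))
    nearest = np.full(len(pts), np.inf)      # distance to the chosen set
    for i in range(1, m):
        nearest = np.minimum(nearest,
                             np.linalg.norm(pts - pts[idx[i - 1]], axis=1))
        idx[i] = int(np.argmax(nearest))
    return idx

def embed(pts, m=128, k=16):
    """FPS + k-nearest-neighbor grouping, followed by a toy local
    embedding: relative coordinates max-pooled over each neighborhood."""
    centers = pts[farthest_point_sampling(pts, m)]
    d = np.linalg.norm(centers[:, None] - pts[None, :], axis=-1)  # (m, n)
    knn = np.argsort(d, axis=1)[:, :k]                            # (m, k)
    return (pts[knn] - centers[:, None]).max(axis=1)              # (m, 3)

def self_attention(x):
    """Plain softmax self-attention with identity projections; PCT's
    learned projections and offset-attention are omitted here."""
    s = x @ x.T / np.sqrt(x.shape[1])
    a = np.exp(s - s.max(axis=1, keepdims=True))
    return (a / a.sum(axis=1, keepdims=True)) @ x

def global_descriptor(pts):
    # Max-pooling over points discards ordering entirely.
    return self_attention(embed(pts)).max(axis=0)

rng = np.random.default_rng(0)
cloud = rng.standard_normal((1024, 3))
# Shuffling the input leaves the global descriptor unchanged.
assert np.allclose(global_descriptor(cloud),
                   global_descriptor(cloud[rng.permutation(len(cloud))]))
```

The final max-pool is what turns the permutation-equivariant attention output into a permutation-invariant descriptor.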
Related papers
- Rendering-Oriented 3D Point Cloud Attribute Compression using Sparse Tensor-based Transformer [52.40992954884257]
3D visualization techniques have fundamentally transformed how we interact with digital content.
The massive data size of point clouds presents significant challenges for data compression.
We propose an end-to-end deep learning framework that seamlessly integrates point cloud attribute compression (PCAC) with differentiable rendering.
arXiv Detail & Related papers (2024-11-12T16:12:51Z)
- Adaptive Point Transformer [88.28498667506165]
Adaptive Point Cloud Transformer (AdaPT) is a standard Point Transformer (PT) model augmented with an adaptive token selection mechanism.
AdaPT dynamically reduces the number of tokens during inference, enabling efficient processing of large point clouds (a toy sketch of token selection follows this entry).
arXiv Detail & Related papers (2024-01-26T13:24:45Z)
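As a rough illustration of the token-selection idea (not AdaPT's learned mechanism), the sketch below prunes a token set to a fixed budget using a stand-in saliency score; in AdaPT the selection is learned end to end and adapts per input.

```python
import numpy as np

def select_tokens(tokens, keep_ratio=0.25):
    """Keep only the highest-scoring tokens so later attention layers,
    whose cost grows quadratically with token count, see fewer inputs.
    The L2-norm score is a placeholder for a learned saliency head."""
    scores = np.linalg.norm(tokens, axis=1)       # (n,) saliency proxy
    k = max(1, int(len(tokens) * keep_ratio))
    keep = np.sort(np.argsort(scores)[-k:])       # top-k, original order
    return tokens[keep], keep

tokens = np.random.default_rng(0).standard_normal((1024, 64))
pruned, kept = select_tokens(tokens)
print(pruned.shape)   # (256, 64): 4x fewer tokens downstream
```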
- Parametric Surface Constrained Upsampler Network for Point Cloud [33.033469444588086]
We introduce a novel surface regularizer into the upsampler network by forcing the neural network to learn the underlying parametric surface represented by bicubic functions and rotation functions.
These designs are integrated into two different networks for two tasks that take advantage of upsampling layers.
State-of-the-art experimental results on both tasks demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2023-03-14T21:12:54Z)
- SeedFormer: Patch Seeds based Point Cloud Completion with Upsample Transformer [46.800630776714016]
We propose a novel SeedFormer to improve the ability of detail preservation and recovery in point cloud completion.
We introduce a new shape representation, namely Patch Seeds, which not only captures general structures from partial inputs but also preserves regional information of local patterns.
Our method outperforms state-of-the-art completion networks on several benchmark datasets.
arXiv Detail & Related papers (2022-07-21T06:15:59Z)
- PMP-Net++: Point Cloud Completion by Transformer-Enhanced Multi-step Point Moving Paths [60.32185890237936]
We design a novel neural network, named PMP-Net++, that mimics the behavior of an earth mover.
It moves each point of the incomplete input to obtain a complete point cloud, such that the total distance of the point moving paths (PMPs) is the shortest.
The network learns a strict and unique point-level correspondence, which improves the quality of the predicted complete shape.
arXiv Detail & Related papers (2022-02-19T03:00:40Z)
- Point-BERT: Pre-training 3D Point Cloud Transformers with Masked Point Modeling [104.82953953453503]
We present Point-BERT, a new paradigm for learning Transformers that generalizes the concept of BERT to 3D point clouds.
Experiments demonstrate that the proposed BERT-style pre-training strategy significantly improves the performance of standard point cloud Transformers (the masking step is sketched after this entry).
arXiv Detail & Related papers (2021-11-29T18:59:03Z)
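The masking step of BERT-style pre-training can be sketched as follows. This is a simplified illustration that assumes point-patch embeddings are already computed; the transformer encoder and the dVAE tokenizer that Point-BERT uses to define prediction targets are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
patch_tokens = rng.standard_normal((64, 256))  # 64 point-patch embeddings
mask_token = rng.standard_normal(256)          # learnable vector in practice

# Hide a random subset of patches; a transformer encoder would then be
# trained to recover the discrete ids of the masked patches.
mask_ratio = 0.4
masked = rng.choice(len(patch_tokens),
                    size=int(len(patch_tokens) * mask_ratio), replace=False)
corrupted = patch_tokens.copy()
corrupted[masked] = mask_token
```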
- PU-Transformer: Point Cloud Upsampling Transformer [38.05362492645094]
We focus on the point cloud upsampling task, which aims to generate dense, high-fidelity point clouds from sparse input data.
Specifically, to exploit the transformer's strong capability in feature representation, we develop a new variant of the multi-head self-attention structure.
We demonstrate the outstanding performance of our approach by comparison with state-of-the-art CNN-based methods on several benchmarks.
arXiv Detail & Related papers (2021-11-24T03:25:35Z)
- PoinTr: Diverse Point Cloud Completion with Geometry-Aware Transformers [81.71904691925428]
We present a new method that reformulates point cloud completion as a set-to-set translation problem.
We also design a new model, called PoinTr, that adopts a transformer encoder-decoder architecture for point cloud completion.
Our method outperforms state-of-the-art methods by a large margin on both the new benchmarks and the existing ones.
arXiv Detail & Related papers (2021-08-19T17:58:56Z)
- Permutation Matters: Anisotropic Convolutional Layer for Learning on Point Clouds [145.79324955896845]
We propose a permutable anisotropic convolutional operation (PAI-Conv) that calculates a soft-permutation matrix for each point.
Experiments on point clouds demonstrate that PAI-Conv produces competitive results in classification and semantic segmentation tasks.
arXiv Detail & Related papers (2020-05-27T02:42:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.