Lossless Coding of Point Cloud Geometry using a Deep Generative Model
- URL: http://arxiv.org/abs/2107.00400v1
- Date: Thu, 1 Jul 2021 12:20:22 GMT
- Title: Lossless Coding of Point Cloud Geometry using a Deep Generative Model
- Authors: Dat Thanh Nguyen, Maurice Quach, Giuseppe Valenzise, Pierre Duhamel
- Abstract summary: The method adaptively partitions a point cloud into blocks of multiple voxel sizes.
A deep auto-regressive generative model estimates the occupancy probability of each voxel.
The estimated probabilities drive a context-based arithmetic coder to code each block efficiently.
- Score: 11.69103847045569
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes a lossless point cloud (PC) geometry compression method
that uses neural networks to estimate the probability distribution of voxel
occupancy. First, to take into account the PC sparsity, our method adaptively
partitions a point cloud into multiple voxel block sizes. This partitioning is
signalled via an octree. Second, we employ a deep auto-regressive generative
model to estimate the occupancy probability of each voxel given the previously
encoded ones. We then use the estimated probabilities to efficiently code a
block with a context-based arithmetic coder. Our context has a variable size and
can expand beyond the current block to learn more accurate probabilities. We
also consider using data augmentation techniques to increase the generalization
capability of the learned probability models, in particular in the presence of
noise and lower-density point clouds. Experimental evaluation, performed on a
variety of point clouds from four different datasets and with diverse
characteristics, demonstrates that our method reduces significantly (by up to
30%) the rate for lossless coding compared to the state-of-the-art MPEG codec.
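The core idea above, coding each voxel with an arithmetic coder driven by an auto-regressively predicted occupancy probability, can be illustrated by computing the ideal codelength such a coder approaches. This is a minimal sketch, not the paper's implementation: `prob_model` stands in for the deep generative model and is a hypothetical callable mapping the already-coded voxels to P(occupied).

```python
import math

import numpy as np


def occupancy_codelength(block, prob_model):
    """Ideal number of bits to losslessly code a binary voxel block when
    each voxel's occupancy probability is predicted from the previously
    coded voxels (raster-scan order). A context-based arithmetic coder
    approaches this -log2 likelihood within a small constant overhead."""
    bits = 0.0
    flat = block.ravel()
    for i, v in enumerate(flat):
        # P(voxel_i = 1 | voxels coded so far); the paper uses a deep
        # auto-regressive model here, this sketch leaves it pluggable.
        p = prob_model(flat[:i])
        p = min(max(p, 1e-6), 1.0 - 1e-6)  # clamp for numerical safety
        bits += -math.log2(p) if v else -math.log2(1.0 - p)
    return bits


# Usage: an uninformative model (p = 0.5) costs exactly 1 bit per voxel;
# any model that predicts the true sparsity does better on sparse blocks.
rng = np.random.default_rng(0)
block = (rng.random((2, 2, 2)) < 0.2).astype(np.uint8)
uniform_bits = occupancy_codelength(block, lambda history: 0.5)
```

The better the model's probabilities match the data, the lower this codelength, which is exactly why a learned, context-aware predictor translates into rate savings.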
Related papers
- Point Cloud Compression with Bits-back Coding [32.9521748764196]
This paper uses a deep learning-based probabilistic model to estimate the Shannon entropy of the point cloud information.
Once the entropy of the point cloud dataset is estimated, the learned CVAE model is used to compress the geometric attributes of the point clouds.
The novelty of the method lies in applying bits-back coding with the learned latent variable model of the CVAE to compress the point cloud data.
arXiv Detail & Related papers (2024-10-09T06:34:48Z)
- PVContext: Hybrid Context Model for Point Cloud Compression [61.24130634750288]
We propose PVContext, a hybrid context model for effective octree-based point cloud compression.
PVContext comprises two components with distinct modalities: the Voxel Context, which accurately represents local geometric information using voxels, and the Point Context, which efficiently preserves global shape information from point clouds.
arXiv Detail & Related papers (2024-09-19T12:47:35Z)
- Geometric Prior Based Deep Human Point Cloud Geometry Compression [67.49785946369055]
We leverage the human geometric prior in geometry redundancy removal of point clouds.
We can envisage high-resolution human point clouds as a combination of geometric priors and structural deviations.
The proposed framework can operate in a plug-and-play fashion with existing learning-based point cloud compression methods.
arXiv Detail & Related papers (2023-05-02T10:35:20Z)
- Lossless Point Cloud Geometry and Attribute Compression Using a Learned Conditional Probability Model [2.670322123407995]
We present an efficient point cloud compression method that uses tensor-based deep neural networks to learn point cloud geometry and color probability.
Our method represents a point cloud with both occupancy feature and three features at different bit depths in a unified representation.
arXiv Detail & Related papers (2023-03-11T23:50:02Z)
- Deep probabilistic model for lossless scalable point cloud attribute compression [2.2559617939136505]
We build an end-to-end point cloud attribute coding method (MNeT) that progressively projects the attributes onto multiscale latent spaces.
We validate our method on a set of point clouds from MVUB and MPEG and show that our method outperforms recently proposed methods and is on par with the latest G-PCC version 14.
arXiv Detail & Related papers (2023-03-11T23:39:30Z)
- Deep Geometry Post-Processing for Decompressed Point Clouds [32.72083309729585]
Point cloud compression plays a crucial role in reducing the huge cost of data storage and transmission.
We propose a novel learning-based post-processing method to enhance the decompressed point clouds.
Experimental results show that the proposed method can significantly improve the quality of the decompressed point clouds.
arXiv Detail & Related papers (2022-04-29T08:57:03Z)
- Scaling Structured Inference with Randomization [64.18063627155128]
We propose a family of randomized dynamic programming (RDP) algorithms for scaling structured models to tens of thousands of latent states.
Our method is widely applicable to classical DP-based inference.
It is also compatible with automatic differentiation so can be integrated with neural networks seamlessly.
arXiv Detail & Related papers (2021-12-07T11:26:41Z)
- Learning-based lossless compression of 3D point cloud geometry [11.69103847045569]
The encoder operates in a hybrid mode, mixing octree and voxel-based coding.
Our method outperforms the state-of-the-art MPEG G-PCC standard with average rate savings of 28%.
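The hybrid octree/voxel strategy, also used for the adaptive block partitioning in the main paper above, can be sketched as a recursion that prunes empty space via the octree and hands small or dense blocks to the voxel-domain coder. This is an illustrative sketch under assumed thresholds (`min_size`, `density_thresh` are hypothetical parameters, not values from either paper).

```python
import numpy as np


def partition(block, min_size=8, density_thresh=0.01):
    """Recursively split a cubic binary voxel block into octants,
    returning (origin, size) leaves for the voxel-domain coder.
    Empty octants are pruned (only an octree bit signals them);
    small or sufficiently dense blocks are kept whole."""
    def recurse(origin, sub):
        n = sub.shape[0]
        occ = int(sub.sum())
        if occ == 0:
            return []  # empty octant: nothing left to code
        if n <= min_size or occ / sub.size >= density_thresh:
            return [(origin, n)]  # leaf: code occupancies directly
        h = n // 2
        leaves = []
        for dz in (0, h):
            for dy in (0, h):
                for dx in (0, h):
                    child = sub[dz:dz + h, dy:dy + h, dx:dx + h]
                    o = (origin[0] + dz, origin[1] + dy, origin[2] + dx)
                    leaves += recurse(o, child)
        return leaves
    return recurse((0, 0, 0), block)
```

On sparse inputs this concentrates the expensive voxel-level probability modelling on the few occupied regions, which is the motivation for mixing the two coding modes.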
arXiv Detail & Related papers (2020-11-30T11:27:16Z)
- Deep Magnification-Flexible Upsampling over 3D Point Clouds [103.09504572409449]
We propose a novel end-to-end learning-based framework to generate dense point clouds.
We first formulate the problem explicitly, which boils down to determining the weights and high-order approximation errors.
Then, we design a lightweight neural network to adaptively learn unified and sorted weights as well as the high-order refinements.
arXiv Detail & Related papers (2020-11-25T14:00:18Z)
- MuSCLE: Multi Sweep Compression of LiDAR using Deep Entropy Models [78.93424358827528]
We present a novel compression algorithm for reducing the storage streams of LiDAR sensor data.
Our method significantly reduces the joint geometry and intensity bitrate compared to prior state-of-the-art LiDAR compression methods.
arXiv Detail & Related papers (2020-11-15T17:41:14Z)
- OctSqueeze: Octree-Structured Entropy Model for LiDAR Compression [77.8842824702423]
We present a novel deep compression algorithm to reduce the memory footprint of LiDAR point clouds.
Our method exploits the sparsity and structural redundancy between points to reduce the memory footprint.
Our algorithm can be used to reduce the onboard and offboard storage of LiDAR points for applications such as self-driving cars.
arXiv Detail & Related papers (2020-05-14T17:48:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its contents (including all information) and is not responsible for any consequences.