Learning to Generate Realistic LiDAR Point Clouds
- URL: http://arxiv.org/abs/2209.03954v1
- Date: Thu, 8 Sep 2022 17:58:04 GMT
- Title: Learning to Generate Realistic LiDAR Point Clouds
- Authors: Vlas Zyrianov, Xiyue Zhu, Shenlong Wang
- Abstract summary: LiDARGen is a novel, effective, and controllable generative model that produces realistic LiDAR point cloud sensory readings.
We validate our method on the challenging KITTI-360 and NuScenes datasets.
- Score: 15.976199637414886
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present LiDARGen, a novel, effective, and controllable generative model
that produces realistic LiDAR point cloud sensory readings. Our method
leverages the powerful score-matching energy-based model and formulates the
point cloud generation process as a stochastic denoising process in the
equirectangular view. This model allows us to sample diverse and high-quality
point cloud samples with guaranteed physical feasibility and controllability.
We validate the effectiveness of our method on the challenging KITTI-360 and
NuScenes datasets. The quantitative and qualitative results show that our
approach produces more realistic samples than other generative models.
Furthermore, LiDARGen can sample point clouds conditioned on inputs without
retraining. We demonstrate that our proposed generative model could be directly
used to densify LiDAR point clouds. Our code is available at:
https://www.zyrianov.org/lidargen/
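The stochastic denoising process described in the abstract can be sketched as annealed Langevin dynamics over a noise schedule, applied to an equirectangular range image. The sketch below is a minimal illustration under stated assumptions, not the paper's implementation: `toy_score` is a hypothetical stand-in (the score of an isotropic Gaussian centred on a fixed target image), whereas LiDARGen learns the score function with a neural network.

```python
import numpy as np

def toy_score(x, target, sigma):
    """Score (gradient of the log density) of N(target, sigma^2 I) at x.
    Stand-in for a learned score network."""
    return (target - x) / sigma**2

def annealed_langevin_sample(target, sigmas, steps_per_level=20,
                             step_scale=0.1, rng=None):
    """Draw one sample by running Langevin dynamics at each noise level
    of the decreasing schedule `sigmas` (largest first)."""
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(target.shape) * sigmas[0]
    for sigma in sigmas:
        alpha = step_scale * sigma**2  # step size shrinks with the noise level
        for _ in range(steps_per_level):
            noise = rng.standard_normal(x.shape)
            x = x + 0.5 * alpha * toy_score(x, target, sigma) \
                + np.sqrt(alpha) * noise
    return x

# Demo on a small "range image": 64 beams x 1024 azimuth bins is a
# typical LiDAR resolution; the clean depth is a constant 10 m here.
target = np.full((64, 1024), 10.0)
sigmas = np.geomspace(5.0, 0.05, num=10)
sample = annealed_langevin_sample(target, sigmas, rng=0)
print(sample.shape, float(np.abs(sample - target).mean()))  # mean error is small
```

Starting from pure noise at the largest `sigma` and gradually lowering it is what lets the same trained score model be reused for conditional tasks such as densification: conditioning can be injected during sampling without retraining.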
Related papers
- UltraLiDAR: Learning Compact Representations for LiDAR Completion and Generation [51.443788294845845]
We present UltraLiDAR, a data-driven framework for scene-level LiDAR completion, LiDAR generation, and LiDAR manipulation.
We show that by aligning the representation of a sparse point cloud to that of a dense point cloud, we can densify the sparse point clouds.
By learning a prior over the discrete codebook, we can generate diverse, realistic LiDAR point clouds for self-driving.
arXiv Detail & Related papers (2023-11-02T17:57:03Z)
- LiDAR Data Synthesis with Denoising Diffusion Probabilistic Models [1.1965844936801797]
Generative modeling of 3D LiDAR data is an emerging task with promising applications for autonomous mobile robots.
We present R2DM, a novel generative model for LiDAR data that can generate diverse and high-fidelity 3D scene point clouds.
Our method is built upon denoising diffusion probabilistic models (DDPMs), which have shown impressive results among generative model frameworks.
arXiv Detail & Related papers (2023-09-17T12:26:57Z)
- NeRF-LiDAR: Generating Realistic LiDAR Point Clouds with Neural Radiance Fields [20.887421720818892]
We present NeRF-LiDAR, a novel LiDAR simulation method that leverages real-world information to generate realistic LiDAR point clouds.
We verify the effectiveness of our NeRF-LiDAR by training different 3D segmentation models on the generated LiDAR point clouds.
arXiv Detail & Related papers (2023-04-28T12:41:28Z)
- StarNet: Style-Aware 3D Point Cloud Generation [82.30389817015877]
StarNet is able to reconstruct and generate high-fidelity 3D point clouds using a mapping network.
Our framework achieves comparable state-of-the-art performance on various metrics in the point cloud reconstruction and generation tasks.
arXiv Detail & Related papers (2023-03-28T08:21:44Z)
- Controllable Mesh Generation Through Sparse Latent Point Diffusion Models [105.83595545314334]
We design a novel sparse latent point diffusion model for mesh generation.
Our key insight is to regard point clouds as an intermediate representation of meshes, and model the distribution of point clouds instead.
Our proposed sparse latent point diffusion model achieves superior performance in terms of generation quality and controllability.
arXiv Detail & Related papers (2023-03-14T14:25:29Z)
- Learning to Simulate Realistic LiDARs [66.7519667383175]
We introduce a pipeline for data-driven simulation of a realistic LiDAR sensor.
We show that our model can learn to encode realistic effects such as dropped points on transparent surfaces.
We use our technique to learn models of two distinct LiDAR sensors and use them to improve simulated LiDAR data accordingly.
arXiv Detail & Related papers (2022-09-22T13:12:54Z)
- Representing Point Clouds with Generative Conditional Invertible Flow Networks [15.280751949071016]
We propose a simple yet effective method to represent point clouds as sets of samples drawn from a cloud-specific probability distribution.
Our method leverages generative conditional invertible flow networks both to learn embeddings and to generate point clouds.
Our model offers competitive or superior quantitative results on benchmark datasets.
arXiv Detail & Related papers (2020-10-07T18:30:47Z)
- Learning Gradient Fields for Shape Generation [69.85355757242075]
A point cloud can be viewed as samples from a distribution of 3D points whose density is concentrated near the surface of the shape.
We generate point clouds by performing gradient ascent on an unnormalized probability density.
Our model directly predicts the gradient of the log density field and can be trained with a simple objective adapted from score-based generative models.
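The sampling idea summarized above, gradient ascent on an unnormalized log density whose mass concentrates near the shape surface, can be illustrated with a toy density. The sketch below is an assumption-laden stand-in: `log_density_grad` is hand-written for points near a unit sphere, standing in for the learned gradient field the paper trains with a score-based objective.

```python
import numpy as np

def log_density_grad(points, sigma=0.1):
    """Gradient of log p(x) for p(x) proportional to
    exp(-(|x| - 1)^2 / (2 sigma^2)): density concentrated near the
    unit-sphere surface. Hand-written stand-in for a learned field."""
    r = np.linalg.norm(points, axis=1, keepdims=True)
    direction = points / np.maximum(r, 1e-8)
    return -(r - 1.0) / sigma**2 * direction

def generate_points(n=1000, steps=200, lr=1e-3, rng=None):
    """Generate a point cloud by gradient ascent on the log density,
    starting from Gaussian noise."""
    rng = np.random.default_rng(rng)
    x = rng.standard_normal((n, 3))
    for _ in range(steps):
        x = x + lr * log_density_grad(x)
    return x

pts = generate_points(rng=0)
radii = np.linalg.norm(pts, axis=1)
print(radii.mean())  # mean radius approaches 1.0 as points settle on the surface
```

Because the gradient points toward the high-density region near the surface, random initial points flow onto the shape; the paper's contribution is learning that gradient field from data rather than writing it down.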
arXiv Detail & Related papers (2020-08-14T18:06:15Z)
- Generative PointNet: Deep Energy-Based Learning on Unordered Point Sets for 3D Generation, Reconstruction and Classification [136.57669231704858]
We propose a generative model of unordered point sets, such as point clouds, in the form of an energy-based model.
We call our model the Generative PointNet because it can be derived from the discriminative PointNet.
arXiv Detail & Related papers (2020-04-02T23:08:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.