PCGen: Point Cloud Generator for LiDAR Simulation
- URL: http://arxiv.org/abs/2210.08738v1
- Date: Mon, 17 Oct 2022 04:13:21 GMT
- Title: PCGen: Point Cloud Generator for LiDAR Simulation
- Authors: Chenqi Li, Yuan Ren, Bingbing Liu
- Abstract summary: Existing methods generate data that are noisier and more complete than real point clouds.
We propose FPA raycasting and surrogate model raydrop.
With minimal training data, the surrogate model can generalize to different geographies and scenes.
Results show that object detection models trained on simulated data achieve results comparable to models trained on real data.
- Score: 10.692184635629792
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data is a fundamental building block for LiDAR perception systems.
Unfortunately, real-world data collection and annotation are extremely costly and
laborious. Recently, real data based LiDAR simulators have shown tremendous
potential to complement real data, due to their scalability and high-fidelity
compared to graphics engine based methods. Before simulation can be deployed in
the real world, two shortcomings need to be addressed. First, existing methods
usually generate data that are noisier and more complete than real point
clouds, due to 3D reconstruction errors and purely geometry-based raycasting.
Second, prior works on simulation for object detection focus solely on rigid
objects, like cars, even though VRUs, like pedestrians, are also important road
participants. To tackle the first challenge, we propose FPA raycasting and
surrogate model raydrop. FPA enables the simulation of both point cloud
coordinates and sensor features, while taking into account reconstruction
noise. The ray-wise surrogate raydrop model mimics the physical properties of
LiDAR's laser receiver to determine whether a simulated point would be recorded
by a real LiDAR. With minimal training data, the surrogate model can generalize
to different geographies and scenes, closing the domain gap between raycasted
and real point clouds. To tackle the simulation of deformable VRUs, we employ
the SMPL dataset to provide a pedestrian simulation baseline and compare the
domain gap between CAD and reconstructed objects. Applying our pipeline to
perform novel sensor synthesis, results show that object detection models
trained on simulated data achieve results comparable to models trained on real
data.
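As a rough illustration of the ray-wise surrogate raydrop idea described above, the sketch below filters raycasted points with a small per-ray classifier. The feature set (range, incidence angle, reflectance), network size, and keep threshold are illustrative assumptions rather than the paper's exact design.

```python
# Hypothetical sketch of a ray-wise surrogate raydrop model: a small MLP scores each
# raycasted return and decides whether a real LiDAR receiver would have recorded it.
# Feature choice, architecture, and threshold are illustrative assumptions.
import torch
import torch.nn as nn

class SurrogateRayDrop(nn.Module):
    def __init__(self, in_features: int = 3, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # logit for "this return is recorded"
        )

    def forward(self, ray_features: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(ray_features)).squeeze(-1)

def filter_raycast(points: torch.Tensor, ray_features: torch.Tensor,
                   model: SurrogateRayDrop, keep_threshold: float = 0.5) -> torch.Tensor:
    """Keep only the simulated points the surrogate predicts a real sensor would record."""
    with torch.no_grad():
        keep_prob = model(ray_features)
    return points[keep_prob > keep_threshold]

# Example: 1000 raycasted points with per-ray features (range, incidence angle, reflectance).
points = torch.rand(1000, 3) * 50.0
features = torch.rand(1000, 3)
kept = filter_raycast(points, features, SurrogateRayDrop())
```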
Related papers
- LiDAR-GS: Real-time LiDAR Re-Simulation using Gaussian Splatting [50.808933338389686]
LiDAR simulation plays a crucial role in closed-loop simulation for autonomous driving.
We present LiDAR-GS, the first LiDAR Gaussian Splatting method, for real-time high-fidelity re-simulation of LiDAR sensor scans in public urban road scenes.
Our approach succeeds in simultaneously re-simulating depth, intensity, and ray-drop channels, achieving state-of-the-art results in both rendering frame rate and quality on publicly available large scene datasets.
arXiv Detail & Related papers (2024-10-07T15:07:56Z)
- GAN-Based LiDAR Intensity Simulation [3.8697834534260447]
We train GANs to translate between camera images and LiDAR scans from real test drives.
We evaluate the LiDAR simulation by testing how well an object detection network generalizes between real and synthetic point clouds.
arXiv Detail & Related papers (2023-11-26T20:44:09Z)
- Quantifying the LiDAR Sim-to-Real Domain Shift: A Detailed Investigation Using Object Detectors and Analyzing Point Clouds at Target-Level [1.1999555634662635]
LiDAR object detection algorithms based on neural networks for autonomous driving require large amounts of data for training, validation, and testing.
We show that using simulated data for the training of neural networks leads to a domain shift between training and testing data due to differences in scenes, scenarios, and distributions.
arXiv Detail & Related papers (2023-03-03T12:52:01Z)
- Grounding Graph Network Simulators using Physical Sensor Observations [12.017054986629846]
We integrate sensory information to ground Graph Network Simulators on real world observations.
We predict the mesh state of deformable objects by utilizing point cloud data.
arXiv Detail & Related papers (2023-02-23T09:06:42Z)
- Learning to Simulate Realistic LiDARs [66.7519667383175]
We introduce a pipeline for data-driven simulation of a realistic LiDAR sensor.
We show that our model can learn to encode realistic effects such as dropped points on transparent surfaces.
We use our technique to learn models of two distinct LiDAR sensors and use them to improve simulated LiDAR data accordingly.
arXiv Detail & Related papers (2022-09-22T13:12:54Z)
- A Realism Metric for Generated LiDAR Point Clouds [2.6205925938720833]
This paper presents a novel metric to quantify the realism of LiDAR point clouds.
Relevant features are learned from real-world and synthetic point clouds by training on a proxy classification task.
In a series of experiments, we demonstrate the application of our metric to determine the realism of generated LiDAR data and compare the realism estimation of our metric to the performance of a segmentation model.
arXiv Detail & Related papers (2022-08-31T16:37:57Z)
- LiDAR Snowfall Simulation for Robust 3D Object Detection [116.10039516404743]
We propose a physically based method to simulate the effect of snowfall on real clear-weather LiDAR point clouds.
Our method samples snow particles in 2D space for each LiDAR line and uses the induced geometry to modify the measurement for each LiDAR beam.
We use our simulation to generate partially synthetic snowy LiDAR data and leverage these data for training 3D object detection models that are robust to snowfall.
arXiv Detail & Related papers (2022-03-28T21:48:26Z)
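A minimal sketch of the beam-wise snowfall idea from the entry above, assuming a Poisson model of particle encounters and simple backscatter/attenuation rules; the rates, probabilities, and factors are illustrative assumptions, not the paper's physics model.

```python
# Hypothetical, simplified snowfall augmentation of a clear-weather scan: for each
# beam, sample how many snow particles it could meet before its original return,
# then either report an early backscatter point or attenuate the original return.
# All rates and factors below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def snowify_beam(range_m: float, intensity: float,
                 particles_per_meter: float = 0.05,
                 backscatter_prob: float = 0.1,
                 attenuation_per_particle: float = 0.03):
    """Return an adjusted (range, intensity) pair for one LiDAR beam under snowfall."""
    n_particles = rng.poisson(particles_per_meter * range_m)
    if n_particles > 0 and rng.random() < backscatter_prob:
        # A particle dominates the return: the beam reports an early, weak hit.
        return rng.uniform(0.5, range_m), intensity * 0.1
    # Otherwise the original surface is still hit, but the signal is attenuated.
    return range_m, intensity * (1.0 - attenuation_per_particle) ** n_particles

ranges = np.array([12.0, 35.5, 7.2])
intensities = np.array([0.8, 0.4, 0.9])
snowy = [snowify_beam(r, i) for r, i in zip(ranges, intensities)]
```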
- Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in Adverse Weather [92.84066576636914]
This work addresses the challenging task of LiDAR-based 3D object detection in foggy weather.
We tackle this problem by simulating physically accurate fog into clear-weather scenes.
We are the first to provide strong 3D object detection baselines on the Seeing Through Fog dataset.
arXiv Detail & Related papers (2021-08-11T14:37:54Z)
- Recovering and Simulating Pedestrians in the Wild [81.38135735146015]
We propose to recover the shape and motion of pedestrians from sensor readings captured in the wild by a self-driving car driving around.
We incorporate the reconstructed pedestrian assets bank in a realistic 3D simulation system.
We show that the simulated LiDAR data can be used to significantly reduce the amount of real-world data required for visual perception tasks.
arXiv Detail & Related papers (2020-11-16T17:16:32Z)
- LiDARsim: Realistic LiDAR Simulation by Leveraging the Real World [84.57894492587053]
We develop a novel simulator that captures both the power of physics-based and learning-based simulation.
We first utilize ray casting over the 3D scene and then use a deep neural network to produce deviations from the physics-based simulation.
We showcase LiDARsim's usefulness for perception algorithms-testing on long-tail events and end-to-end closed-loop evaluation on safety-critical scenarios.
arXiv Detail & Related papers (2020-06-16T17:44:35Z)
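To make the raycast-then-learn idea in the LiDARsim entry above concrete, here is a rough sketch in which a network predicts a per-point range correction and drop probability on top of a physics-based raycast; the interface, feature choice, and sizes are illustrative assumptions, not the paper's architecture.

```python
# Hypothetical sketch of a hybrid simulator: physics-based raycasting provides initial
# ranges, and a learned network adds per-point deviations (range offset, drop probability)
# to move the result toward real sensor behaviour. Features and sizes are assumptions.
import torch
import torch.nn as nn

class RaycastRefiner(nn.Module):
    """Predicts a range correction and a drop probability for each raycasted point."""
    def __init__(self, in_features: int = 4, hidden: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.range_offset = nn.Linear(hidden, 1)
        self.drop_logit = nn.Linear(hidden, 1)

    def forward(self, feats: torch.Tensor):
        h = self.backbone(feats)
        return self.range_offset(h).squeeze(-1), torch.sigmoid(self.drop_logit(h)).squeeze(-1)

# Apply learned deviations on top of a physics-based raycast.
feats = torch.rand(2048, 4)            # e.g. range, incidence angle, reflectance, height
raycast_range = torch.rand(2048) * 80  # metres, straight from the raycaster
offset, drop_prob = RaycastRefiner()(feats)
refined_range = raycast_range + offset
kept_range = refined_range[drop_prob < 0.5]
```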
This list is automatically generated from the titles and abstracts of the papers on this site.