Polarization Wavefront Lidar: Learning Large Scene Reconstruction from Polarized Wavefronts
- URL: http://arxiv.org/abs/2406.03461v2
- Date: Tue, 11 Jun 2024 09:56:15 GMT
- Authors: Dominik Scheuble, Chenyang Lei, Seung-Hwan Baek, Mario Bijelic, Felix Heide
- Abstract summary: We introduce a novel long-range polarization wavefront lidar sensor (PolLidar) that modulates the polarization of the emitted and received light.
We leverage polarimetric wavefronts to estimate normals, distance, and material properties in outdoor scenarios with a novel learned reconstruction method.
- Score: 46.79906673307029
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Lidar has become a cornerstone sensing modality for 3D vision, especially for large outdoor scenarios and autonomous driving. Conventional lidar sensors are capable of providing centimeter-accurate distance information by emitting laser pulses into a scene and measuring the time-of-flight (ToF) of the reflection. However, the polarization of the received light, which depends on the surface orientation and material properties, is usually not considered. As such, the polarization modality has the potential to improve scene reconstruction beyond distance measurements. In this work, we introduce a novel long-range polarization wavefront lidar sensor (PolLidar) that modulates the polarization of the emitted and received light. Departing from conventional lidar sensors, PolLidar allows access to the raw time-resolved polarimetric wavefronts. We leverage polarimetric wavefronts to estimate normals, distance, and material properties in outdoor scenarios with a novel learned reconstruction method. To train and evaluate the method, we introduce a simulated and real-world long-range dataset with paired raw lidar data, ground truth distance, and normal maps. We find that the proposed method improves normal and distance reconstruction by 53% in mean angular error and 41% in mean absolute error compared to existing shape-from-polarization (SfP) and ToF methods. Code and data are open-sourced at https://light.princeton.edu/pollidar.
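The ToF ranging principle the abstract refers to is simply half the round-trip pulse time multiplied by the speed of light. A minimal illustrative sketch (not the paper's code):

```python
# Illustrative ToF ranging sketch; not the PolLidar pipeline.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(round_trip_time_s: float) -> float:
    """Convert a round-trip pulse time (seconds) to one-way distance (meters)."""
    return C * round_trip_time_s / 2.0

# A ~667 ns round trip corresponds to roughly 100 m of range.
d = tof_to_distance(667e-9)
```

Centimeter accuracy at this scale requires timing resolution on the order of tens of picoseconds, which is why raw time-resolved wavefronts carry so much information.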
Related papers
- Robust Depth Enhancement via Polarization Prompt Fusion Tuning [112.88371907047396]
We present a framework that leverages polarization imaging to improve inaccurate depth measurements from various depth sensors.
Our method first adopts a learning-based strategy where a neural network is trained to estimate a dense and complete depth map from polarization data and a sensor depth map from different sensors.
To further improve the performance, we propose a Polarization Prompt Fusion Tuning (PPFT) strategy to effectively utilize RGB-based models pre-trained on large-scale datasets.
arXiv Detail & Related papers (2024-04-05T17:55:33Z)
- Polarimetric Information for Multi-Modal 6D Pose Estimation of Photometrically Challenging Objects with Limited Data [51.95347650131366]
6D pose estimation pipelines that rely on RGB-only or RGB-D data show limitations for photometrically challenging objects.
A supervised learning-based method utilising complementary polarisation information is proposed to overcome such limitations.
arXiv Detail & Related papers (2023-08-21T10:56:00Z)
- UnLoc: A Universal Localization Method for Autonomous Vehicles using LiDAR, Radar and/or Camera Input [51.150605800173366]
UnLoc is a novel unified neural modeling approach for localization with multi-sensor input in all weather conditions.
Our method is extensively evaluated on Oxford Radar RobotCar, ApolloSouthBay and Perth-WA datasets.
arXiv Detail & Related papers (2023-07-03T04:10:55Z)
- Polarimetric Multi-View Inverse Rendering [13.391866136230165]
A polarization camera has great potential for 3D reconstruction since the angle of polarization (AoP) and the degree of polarization (DoP) of reflected light are related to an object's surface normal.
We propose a novel 3D reconstruction method called Polarimetric Multi-View Inverse Rendering (Polarimetric MVIR) that effectively exploits geometric, photometric, and polarimetric cues extracted from input multi-view color-polarization images.
arXiv Detail & Related papers (2022-12-24T12:12:12Z)
- Sparse Ellipsometry: Portable Acquisition of Polarimetric SVBRDF and Shape with Unstructured Flash Photography [32.68190169944569]
We present a portable polarimetric acquisition method that captures both polarimetric SVBRDF and 3D shape simultaneously.
Instead of days, the total acquisition time varies between twenty and thirty minutes per object.
Our results show a strong agreement with a recent ground-truth dataset of captured polarimetric BRDFs of real-world objects.
arXiv Detail & Related papers (2022-07-09T09:42:59Z)
- KERPLE: Kernelized Relative Positional Embedding for Length Extrapolation [72.71398034617607]
KERPLE is a framework that generalizes relative position embedding for extrapolation by kernelizing positional differences.
The diversity of CPD kernels allows us to derive various RPEs that enable length extrapolation in a principled way.
arXiv Detail & Related papers (2022-05-20T01:25:57Z)
- Shape from Polarization for Complex Scenes in the Wild [93.65746187211958]
We present a new data-driven approach with physics-based priors to scene-level normal estimation from a single polarization image.
We contribute the first real-world scene-level SfP dataset with paired input polarization images and ground-truth normal maps.
Our trained model can be generalized to far-field outdoor scenes as the relationship between polarized light and surface normals is not affected by distance.
arXiv Detail & Related papers (2021-12-21T17:30:23Z)
- Polarimetric Monocular Dense Mapping Using Relative Deep Depth Prior [8.552832023331248]
We propose an online reconstruction method that uses full polarimetric cues available from the polarization camera.
Our method is able to significantly improve the accuracy of the depth map as well as increase its density, especially in regions of poor texture.
arXiv Detail & Related papers (2021-02-10T01:34:37Z)
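Several of the papers above build on the angle of polarization (AoP) and degree of polarization (DoP). As an illustrative sketch, assuming a standard four-angle linear-polarizer measurement (0°, 45°, 90°, 135°; not any specific paper's setup), both follow from the linear Stokes parameters:

```python
import math

def aop_dop(i0: float, i45: float, i90: float, i135: float):
    """Compute linear Stokes parameters from four polarizer-angle intensities,
    then the degree and angle of linear polarization."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical preference
    s2 = i45 - i135                      # diagonal preference
    dop = math.hypot(s1, s2) / s0        # degree of linear polarization, in [0, 1]
    aop = 0.5 * math.atan2(s2, s1)       # angle of polarization, radians
    return dop, aop
```

The AoP constrains the azimuth of the surface normal and the DoP its zenith angle, which is the basic link that shape-from-polarization methods exploit.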
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.