All-photon Polarimetric Time-of-Flight Imaging
- URL: http://arxiv.org/abs/2112.09278v1
- Date: Fri, 17 Dec 2021 01:51:47 GMT
- Title: All-photon Polarimetric Time-of-Flight Imaging
- Authors: Seung-Hwan Baek, Felix Heide
- Abstract summary: Time-of-flight (ToF) sensors provide an imaging modality fueling diverse applications, including LiDAR in autonomous driving.
Conventional ToF imaging methods estimate the depth by sending pulses of light into a scene and measuring the ToF of the first-arriving photons.
We propose an all-photon ToF imaging method by incorporating the temporal-polarimetric analysis of first- and late-arriving photons.
- Score: 33.499684969102816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time-of-flight (ToF) sensors provide an imaging modality fueling diverse
applications, including LiDAR in autonomous driving, robotics, and augmented
reality. Conventional ToF imaging methods estimate the depth by sending pulses
of light into a scene and measuring the ToF of the first-arriving photons
directly reflected from a scene surface without any temporal delay. As such,
all photons following this first response are typically considered as unwanted
noise. In this paper, we depart from the principle of using first-arriving
photons and propose an all-photon ToF imaging method by incorporating the
temporal-polarimetric analysis of first- and late-arriving photons, which
carry rich information about scene geometry and material. To this end, we
propose a novel temporal-polarimetric reflectance model, an efficient capture
method, and a reconstruction method that exploits the temporal-polarimetric
changes of light reflected by the surface and sub-surface reflection. The
proposed all-photon polarimetric ToF imaging method allows for acquiring depth,
surface normals, and material parameters of a scene by utilizing all photons
captured by the system, whereas conventional ToF imaging only obtains coarse
depth from the first-arriving photons. We validate our method in simulation and
experimentally with a prototype.
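
The abstract describes the approach only at a high level. The toy sketch below is not the authors' model; it is a minimal illustration, with made-up constants, of the underlying intuition: surface reflections arrive first and keep their polarization, sub-surface scattering arrives late and depolarized, and weighting each time bin of the transient by its degree of polarization lets all photons contribute while the polarized photon mass still localizes depth. Every name and number in it (`BIN`, `transient`, the 4 ps bins, the exponential tail) is an assumption for illustration.

```python
import numpy as np

# Illustrative constants only (not from the paper).
C = 3e8                       # speed of light [m/s]
BIN = 4e-12                   # transient histogram bin width [s]
T = np.arange(4096) * BIN     # time axis, ~16 ns span

def transient(depth_m, sss_strength=0.6):
    """Toy two-component transient: a polarization-preserving surface spike
    at the round-trip time plus a delayed, depolarized sub-surface tail.
    Returns (I_total, I_pol), where I_pol is the polarized part of each bin."""
    t0 = 2.0 * depth_m / C                                  # round-trip time
    surface = np.exp(-0.5 * ((T - t0) / (3 * BIN)) ** 2)    # first-arriving photons
    tail = np.zeros_like(T)
    late = T > t0
    tail[late] = sss_strength * np.exp(-(T[late] - t0) / 2e-10)  # late-arriving photons
    return surface + tail, surface    # assume only the surface bounce stays polarized

def estimate_depth_all_photon(I_total, I_pol):
    """Weight every bin by its degree of polarization: depolarized late photons
    carry material/sub-surface information, while the polarized photon mass
    localizes the surface. Depth = c * t_surface / 2 from the polarized centroid."""
    dop = np.divide(I_pol, I_total, out=np.zeros_like(I_total), where=I_total > 1e-9)
    w = dop * I_total                            # polarized photon mass per bin
    t_surface = np.sum(w * T) / np.sum(w)
    return C * t_surface / 2.0

I_total, I_pol = transient(depth_m=1.5)
print(f"estimated depth: {estimate_depth_all_photon(I_total, I_pol):.4f} m")  # ~1.5 m
```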
Related papers
- End-to-End Hybrid Refractive-Diffractive Lens Design with Differentiable Ray-Wave Model [18.183342315517244]
We propose a new hybrid ray-tracing and wave-propagation (ray-wave) model for accurate simulation of both optical aberrations and diffractive phase modulation.
The proposed ray-wave model is fully differentiable, enabling gradient back-propagation for end-to-end co-design of refractive-diffractive lens optimization and the image reconstruction network.
arXiv Detail & Related papers (2024-06-02T18:48:22Z) - Spatial and temporal characteristics of spontaneous parametric down-conversion with varying focal planes of interacting beams [0.0]
Spontaneous parametric down-conversion (SPDC) is a widely used process to prepare entangled photon pairs.
The exact focal plane position of the pump beam relative to those of the detection modes is difficult to determine in a real experiment.
In this work, we consider variable positions of focal planes and investigate how shifts of these focal planes influence the spatial and temporal properties of photon pairs.
arXiv Detail & Related papers (2022-12-23T20:04:24Z) - A Geometric Model for Polarization Imaging on Projective Cameras [5.381004207943598]
We present a geometric model describing how a general projective camera captures the light polarization state.
Our model is implemented as a pre-processing operation acting on raw images, followed by a per-pixel rotation of the reconstructed normal field.
Experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras.
arXiv Detail & Related papers (2022-11-29T17:12:26Z) - Centimeter-Wave Free-Space Time-of-Flight Imaging [25.15384123485028]
We propose a computational imaging method for all-optical free-space correlation before photo-conversion that achieves micron-scale depth resolution.
We propose an imaging approach with resonant polarization modulators and devise a novel optical dual-pass frequency-doubling scheme that achieves high modulation contrast at more than 10 GHz.
We validate the proposed method in simulation and experimentally, where it achieves micron-scale depth precision.
arXiv Detail & Related papers (2021-05-25T01:57:10Z) - Passive Inter-Photon Imaging [18.739224941453983]
- Passive Inter-Photon Imaging [18.739224941453983]
Digital camera pixels measure image intensities by converting incident light energy into an analog electrical current, and then digitizing it into a fixed-width binary representation.
This direct measurement method suffers from limited dynamic range and poor performance under extreme illumination.
We propose a novel intensity cue based on measuring inter-photon timing, defined as the time delay between detection of successive photons.
arXiv Detail & Related papers (2021-03-31T18:44:52Z) - Leveraging Spatial and Photometric Context for Calibrated Non-Lambertian
- Leveraging Spatial and Photometric Context for Calibrated Non-Lambertian Photometric Stereo [61.6260594326246]
We introduce an efficient fully-convolutional architecture that can leverage both spatial and photometric context simultaneously.
Using separable 4D convolutions and 2D heat-maps reduces the network size and improves efficiency.
arXiv Detail & Related papers (2021-03-22T18:06:58Z) - A learning-based view extrapolation method for axial super-resolution [52.748944517480155]
Axial light field resolution refers to the ability to distinguish features at different depths by refocusing.
We propose a learning-based method to extrapolate novel views from axial volumes of sheared epipolar plane images.
arXiv Detail & Related papers (2021-03-11T07:22:13Z) - Single-shot Hyperspectral-Depth Imaging with Learned Diffractive Optics [72.9038524082252]
We propose a compact single-shot monocular hyperspectral-depth (HS-D) imaging method.
Our method uses a diffractive optical element (DOE), the point spread function of which changes with respect to both depth and spectrum.
To facilitate learning the DOE, we present a first HS-D dataset by building a benchtop HS-D imager.
arXiv Detail & Related papers (2020-09-01T14:19:35Z) - Single Image Brightening via Multi-Scale Exposure Fusion with Hybrid
Learning [48.890709236564945]
A small ISO and a small exposure time are usually used to capture an image in backlit or low-light conditions.
In this paper, a single image brightening algorithm is introduced to brighten such an image.
The proposed algorithm includes a unique hybrid learning framework to generate two virtual images with large exposure times.
arXiv Detail & Related papers (2020-07-04T08:23:07Z) - Deep Photon Mapping [59.41146655216394]
- Deep Photon Mapping [59.41146655216394]
In this paper, we develop the first deep learning-based method for particle-based rendering.
We train a novel deep neural network to predict a kernel function to aggregate photon contributions at shading points.
Our network encodes individual photons into per-photon features, aggregates them in the neighborhood of a shading point, and infers a kernel function from the per-photon and photon local context features.
arXiv Detail & Related papers (2020-04-25T06:59:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.