Event-based Shape from Polarization
- URL: http://arxiv.org/abs/2301.06855v2
- Date: Tue, 11 Apr 2023 14:50:04 GMT
- Title: Event-based Shape from Polarization
- Authors: Manasi Muglikar, Leonard Bauersfeld, Diederik Paul Moeys, Davide
Scaramuzza
- Abstract summary: State-of-the-art solutions for Shape-from-Polarization (SfP) suffer from a speed-resolution tradeoff.
We tackle this tradeoff using event cameras.
We propose a setup that consists of a linear polarizer rotating at high speed in front of an event camera.
- Score: 43.483063713471935
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: State-of-the-art solutions for Shape-from-Polarization (SfP) suffer from a
speed-resolution tradeoff: they either sacrifice the number of polarization
angles measured or necessitate lengthy acquisition times due to framerate
constraints, thus compromising either accuracy or latency. We tackle this
tradeoff using event cameras. Event cameras operate at microsecond resolution
with negligible motion blur and output a continuous stream of events that
precisely and asynchronously measures how light changes over time. We propose a
setup that consists of a linear polarizer rotating at high speed in front of
an event camera. Our method uses the continuous event stream caused by the
rotation to reconstruct relative intensities at multiple polarizer angles.
Experiments demonstrate that our method outperforms physics-based baselines
using frames, reducing the MAE by 25% on synthetic and real-world datasets. In
the real world, however, we observe that challenging conditions (i.e., when few
events are generated) harm the performance of physics-based solutions. To
overcome this, we propose a learning-based approach that learns to estimate
surface normals even at low event rates, improving on the physics-based
approach by 52% on the real-world dataset. The proposed system achieves an
acquisition speed equivalent to 50 fps (more than twice the framerate of
commercial polarization sensors) while retaining a spatial resolution of 1 MP.
Our evaluation is based on the first large-scale dataset for event-based SfP.
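To make the acquisition principle concrete, below is a minimal sketch (Python/NumPy) of how relative intensities at multiple polarizer angles could be recovered from the event stream and fitted with Malus's law to obtain the polarization state. The contrast threshold C, the polarizer speed W, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Idealized event model (assumption): each event changes log-intensity by +/-C.
C = 0.2                # hypothetical contrast threshold of the event camera
W = 2 * np.pi * 25.0   # polarizer speed [rad/s]; the Malus period pi/W = 0.02 s
                       # gives a 50 fps-equivalent acquisition rate

def log_intensity_at(event_t, event_pol, t_query):
    """Integrate the signed event stream (+1/-1 polarities, each worth C in
    log-intensity) to get the relative log-intensity at the query times."""
    cum = np.cumsum(event_pol.astype(np.float64)) * C
    idx = np.searchsorted(event_t, t_query, side="right") - 1
    # True log-intensity differs from this by an unknown per-pixel offset.
    return np.where(idx >= 0, cum[np.clip(idx, 0, None)], 0.0)

def fit_polarization(theta, i_rel):
    """Least-squares fit of Malus's law: i_rel ~ a + b*cos(2t) + c*sin(2t).
    The unknown log-offset becomes a multiplicative scale after
    exponentiation, so it cancels in the DoLP ratio below."""
    A = np.stack([np.ones_like(theta), np.cos(2 * theta), np.sin(2 * theta)],
                 axis=1)
    a, b, c = np.linalg.lstsq(A, i_rel, rcond=None)[0]
    rho = np.hypot(b, c) / a       # degree of linear polarization (DoLP)
    phi = 0.5 * np.arctan2(c, b)   # angle of linear polarization (AoLP)
    return rho, phi

# Usage on one pixel's events (event_t: sorted timestamps, event_pol: +/-1):
#   t = np.linspace(0.0, 0.02, 16)[1:]   # query times over one Malus period
#   theta = (W * t) % np.pi              # polarizer angle at each query time
#   i_rel = np.exp(log_intensity_at(event_t, event_pol, t))
#   rho, phi = fit_polarization(theta, i_rel)
```

The per-pixel AoLP/DoLP is what a physics-based SfP pipeline then converts into surface normals; per the abstract, the paper's learning-based variant instead learns to estimate normals even when few events are available.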
Related papers
- Low-Latency Scalable Streaming for Event-Based Vision [0.5242869847419834]
We propose a scalable streaming method for event-based data based on Media Over QUIC.
We show that a state-of-the-art object detection application is resilient to dramatic data loss, with an average reduction in detection mAP as low as 0.36.
arXiv Detail & Related papers (2024-12-10T19:48:57Z)
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping subproblems (typically in parallel).
We build an event-based stereo visual-inertial odometry system on top of a direct pipeline.
The resulting system scales well with modern high-resolution event cameras.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- Event-based Continuous Color Video Decompression from Single Frames [36.4263932473053]
We present ContinuityCam, a novel approach to generate a continuous video from a single static RGB image and an event camera stream.
Our approach combines continuous long-range motion modeling with a neural synthesis model, enabling frame prediction at arbitrary times within the event stream.
arXiv Detail & Related papers (2023-11-30T18:59:23Z)
- Event-based Background-Oriented Schlieren [18.2247510082534]
Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding.
Event cameras offer potential advantages (high dynamic range, high temporal resolution, and data efficiency) over frame-based cameras for this task, owing to their bio-inspired sensing principle.
This paper presents a novel technique for perceiving air convection using events and frames by providing the first theoretical analysis that connects event data and schlieren.
arXiv Detail & Related papers (2023-11-01T10:57:20Z)
- A 5-Point Minimal Solver for Event Camera Relative Motion Estimation [47.45081895021988]
We introduce a novel minimal 5-point solver that estimates line parameters and linear camera velocity projections, which can be fused into a single, averaged linear velocity when considering multiple lines.
Our method consistently achieves a 100% success rate in estimating linear velocity, whereas existing closed-form solvers only achieve between 23% and 70%.
arXiv Detail & Related papers (2023-09-29T08:30:18Z)
- Event-aided Direct Sparse Odometry [54.602311491827805]
We introduce EDS, a direct monocular visual odometry using events and frames.
Our algorithm leverages the event generation model (see the note after this list) to track the camera motion in the blind time between frames.
EDS is the first method to perform 6-DOF VO using events and frames with a direct approach.
arXiv Detail & Related papers (2022-04-15T20:40:29Z)
- Dense Optical Flow from Event Cameras [55.79329250951028]
We propose to incorporate feature correlation and sequential processing into dense optical flow estimation from event cameras.
Our proposed approach computes dense optical flow and reduces the end-point error by 23% on MVSEC.
arXiv Detail & Related papers (2021-08-24T07:39:08Z)
- TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras produce brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
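Several entries above (EDS in particular, as well as the intensity-reconstruction sketch earlier) rely on the standard idealized event generation model. For reference, one common formulation is the following; this is a textbook idealization, not notation taken from any of the papers listed:

```latex
% A pixel x fires an event at time t when the log-intensity change since its
% last event at that pixel reaches the contrast threshold C; the polarity p
% reports the sign of the change.
\left| \log I(\mathbf{x}, t) - \log I(\mathbf{x}, t - \Delta t) \right| \ge C,
\qquad
p = \operatorname{sign}\bigl( \log I(\mathbf{x}, t)
    - \log I(\mathbf{x}, t - \Delta t) \bigr)
```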
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.