Structured light with a million light planes per second
- URL: http://arxiv.org/abs/2411.18597v1
- Date: Wed, 27 Nov 2024 18:44:23 GMT
- Title: Structured light with a million light planes per second
- Authors: Dhawal Sirikonda, Praneeth Chakravarthula, Ioannis Gkioulekas, Adithya Pediredla
- Abstract summary: We introduce a structured light system that captures full-frame depth at rates of a thousand frames per second.
The key innovation is the design of an acousto-optic light scanning device that can scan light planes at rates up to two million planes per second.
- Score: 17.590896196214914
- Abstract: We introduce a structured light system that captures full-frame depth at rates of a thousand frames per second, four times faster than the previous state of the art. Our key innovation to this end is the design of an acousto-optic light scanning device that can scan light planes at rates up to two million planes per second. We combine this device with an event camera for structured light, using the sparse events triggered on the camera as we sweep a light plane on the scene for depth triangulation. In contrast to prior work, where light scanning is the bottleneck towards faster structured light operation, our light scanning device is three orders of magnitude faster than the event camera's full-frame bandwidth, thus allowing us to take full advantage of the event camera's fast operation. To surpass this bandwidth, we additionally demonstrate adaptive scanning of only regions of interest, at speeds an order of magnitude faster than the theoretical full-frame limit for event cameras.
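The abstract describes triangulating depth from the sparse events fired as a light plane sweeps the scene: each event's timestamp identifies the plane's angle at that instant, and intersecting the camera ray through the event pixel with that plane yields depth. A minimal sketch of this geometry, assuming a simplified setup not taken from the paper (a projector at baseline `b` along the camera x-axis sweeping a vertical plane at constant angular rate `omega`, and a pinhole camera with focal length `f` in pixels and principal point `cx`):

```python
import numpy as np

def triangulate_events(events, b, omega, theta0, f, cx):
    """Recover depth from sweep-synchronized events (illustrative only).

    events: array of (x_pixel, t_seconds) rows; each row is one event.
    b:      projector-camera baseline along the camera x-axis (meters).
    omega:  angular sweep rate of the light plane (rad/s).
    theta0: plane angle at t = 0 (rad, measured from the optical axis).
    f, cx:  camera focal length (pixels) and principal point x (pixels).
    """
    x = events[:, 0]
    t = events[:, 1]
    # The event timestamp encodes the plane's angle at the moment of illumination.
    theta = theta0 + omega * t
    # Light plane through the projector origin (b, 0, 0):  X - b = Z * tan(theta)
    # Camera ray through pixel x:                           X = Z * (x - cx) / f
    # Equating X and solving for Z gives the triangulated depth:
    return b / ((x - cx) / f - np.tan(theta))
```

Under this model, depth accuracy hinges on event timestamp resolution, which is why a scanning device far faster than the event camera's bandwidth (as the abstract describes) removes the projector as the bottleneck.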
Related papers
- Event fields: Capturing light fields at high speed, resolution, and dynamic range [9.2152453085337]
"Event Fields" is a new approach that utilizes innovative optical designs for event cameras to capture light fields at high speed.
We develop the underlying mathematical framework for Event Fields and introduce two foundational frameworks to capture them practically.
This novel light-sensing paradigm opens doors to new applications in photography, robotics, and AR/VR, and presents fresh challenges in rendering and machine learning.
arXiv Detail & Related papers (2024-12-09T04:02:49Z)
- Event Cameras Meet SPADs for High-Speed, Low-Bandwidth Imaging [25.13346470561497]
Event cameras and single-photon avalanche diode (SPAD) sensors have emerged as promising alternatives to conventional cameras.
We show that these properties are complementary, and can help achieve low-light, high-speed image reconstruction with low bandwidth requirements.
arXiv Detail & Related papers (2024-04-17T16:06:29Z)
- Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system effectively exploits the signals of light-weight ToF sensors and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z)
- Fusing Frame and Event Vision for High-speed Optical Flow for Edge Application [2.048335092363435]
Event cameras provide continuous asynchronous event streams overcoming the frame-rate limitation.
We fuse the complementary accuracy and speed advantages of the frame and event-based pipelines to provide high-speed optical flow.
arXiv Detail & Related papers (2022-07-21T19:15:05Z)
- Single-Photon Structured Light [31.614032717665832]
"Single-Photon Structured Light" works by sensing binary images that indicate the presence or absence of photon arrivals during each exposure.
We develop novel temporal sequences using error correction codes that are designed to be robust to short-range effects like projector and camera defocus.
Our lab prototype is capable of 3D imaging in challenging scenarios involving objects with extremely low albedo or undergoing fast motion.
arXiv Detail & Related papers (2022-04-11T17:57:04Z)
- 1000x Faster Camera and Machine Vision with Ordinary Devices [76.46540270145698]
We present vidar, a bit sequence array where each bit represents whether the accumulation of photons has reached a threshold.
We have developed a vidar camera that is 1,000x faster than conventional cameras.
We have also developed a spiking neural network-based machine vision system that combines the speed of the machine and the mechanism of biological vision.
arXiv Detail & Related papers (2022-01-23T16:10:11Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Event Guided Depth Sensing [50.997474285910734]
We present an efficient bio-inspired event-camera-driven depth estimation algorithm.
In our approach, we illuminate areas of interest densely, depending on the scene activity detected by the event camera.
We show the feasibility of our approach on simulated autonomous driving sequences and real indoor environments.
arXiv Detail & Related papers (2021-10-20T11:41:11Z)
- Fast Generation and Detection of Spatial Modes of Light using an Acousto-Optic Modulator [62.997667081978825]
Spatial modes of light provide a high-dimensional space that can be used to encode both classical and quantum information.
Current approaches for dynamically generating and measuring these modes are slow, due to the need to reconfigure a high-resolution phase mask.
We experimentally realize this approach, using a double-pass AOM to generate one of five orbital angular momentum states.
We are able to reconstruct arbitrary states in under 1 ms with an average fidelity of 96.9%.
arXiv Detail & Related papers (2020-07-31T14:58:30Z)
- Correlation Plenoptic Imaging between Arbitrary Planes [52.77024349608834]
We show that the protocol enables changing the focal plane in post-processing and achieving an unprecedented combination of image resolution and depth of field.
Results lead the way towards the development of compact designs for correlation plenoptic imaging devices based on chaotic light, as well as high-SNR plenoptic imaging devices based on entangled photon illumination.
arXiv Detail & Related papers (2020-07-23T14:26:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.