RGB-D-E: Event Camera Calibration for Fast 6-DOF Object Tracking
- URL: http://arxiv.org/abs/2006.05011v2
- Date: Wed, 5 Aug 2020 20:41:29 GMT
- Title: RGB-D-E: Event Camera Calibration for Fast 6-DOF Object Tracking
- Authors: Etienne Dubeau, Mathieu Garon, Benoit Debaque, Raoul de Charette,
Jean-François Lalonde
- Abstract summary: We propose to use an event-based camera to increase the speed of 3D object tracking in 6 degrees of freedom.
This application requires handling very high object speeds to convey compelling AR experiences.
We develop a deep learning approach that combines an existing RGB-D network with a novel event-based network in a cascade fashion.
- Score: 16.06615504110132
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Augmented reality devices require multiple sensors to perform various tasks
such as localization and tracking. Currently, popular cameras are mostly
frame-based (e.g., RGB and depth), which imposes a high data bandwidth and
power usage. Given the need for low-power, more responsive augmented reality
systems, relying solely on frame-based sensors limits the algorithms that
need high-frequency data from the environment. As such, event-based sensors
have become increasingly popular due to their low power, bandwidth, and
latency, as well as their very high-frequency data acquisition capabilities.
In this paper, we propose, for the first time, to use an event-based camera
to increase the speed of 3D object tracking in 6 degrees of freedom. This
application requires handling very high object speeds to convey compelling
AR experiences. To this end, we propose a new system which combines a recent
RGB-D sensor (Kinect Azure) with an event camera (DAVIS346). We develop a
deep learning approach that combines an existing RGB-D network with a novel
event-based network in a cascade fashion, and demonstrate that our approach
significantly improves the robustness of a state-of-the-art frame-based
6-DOF object tracker using our RGB-D-E pipeline.
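The cascade described in the abstract can be pictured as a slow frame-based pose update followed by fast event-based refinements between frames. The sketch below is a minimal illustration of that data flow only: the module names, layer sizes, event binning, and 6-vector pose parameterization are hypothetical placeholders, not the authors' architecture.

```python
import torch
import torch.nn as nn

def pose_head(in_channels: int) -> nn.Module:
    """Tiny CNN regressing a 6-DOF pose update (3 translation + 3 axis-angle)."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(32, 6),
    )

# Hypothetical stand-ins: the paper pairs an existing RGB-D tracker with a
# novel event network; these toy modules only illustrate the data flow.
rgbd_net = pose_head(in_channels=4).eval()    # RGB (3 channels) + depth (1)
event_net = pose_head(in_channels=2).eval()   # positive / negative event counts

@torch.no_grad()
def cascade_step(rgbd_frame, event_windows, pose):
    """One RGB-D frame followed by several high-rate event refinements.

    rgbd_frame:    (1, 4, H, W) tensor, one Kinect Azure frame
    event_windows: list of (1, 2, H, W) tensors, DAVIS346 events binned into
                   short windows between consecutive RGB-D frames
    pose:          (1, 6) current object pose estimate
    """
    pose = pose + rgbd_net(rgbd_frame)   # coarse update at frame rate
    for window in event_windows:         # fast refinements between frames
        pose = pose + event_net(window)
    return pose

# Toy usage with random tensors: one frame, eight event windows.
pose = cascade_step(torch.rand(1, 4, 128, 128),
                    [torch.rand(1, 2, 128, 128) for _ in range(8)],
                    torch.zeros(1, 6))
print(pose.shape)  # torch.Size([1, 6])
```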
Related papers
- E-3DGS: Gaussian Splatting with Exposure and Motion Events [29.042018288378447]
We propose E-3DGS, a novel event-based approach that partitions events into motion and exposure events.
We introduce a novel integration of 3DGS with exposure events for high-quality reconstruction of explicit scene representations.
Our method is faster and delivers better reconstruction quality than event-based NeRF while being more cost-effective than NeRF methods.
arXiv Detail & Related papers (2024-10-22T13:17:20Z) - SpikeNVS: Enhancing Novel View Synthesis from Blurry Images via Spike Camera [78.20482568602993]
Conventional RGB cameras are susceptible to motion blur.
Neuromorphic cameras like event and spike cameras inherently capture more comprehensive temporal information.
Our design can enhance novel view synthesis across NeRF and 3DGS.
arXiv Detail & Related papers (2024-04-10T03:31:32Z) - Complementing Event Streams and RGB Frames for Hand Mesh Reconstruction [51.87279764576998]
We propose EvRGBHand -- the first approach for 3D hand mesh reconstruction with an event camera and an RGB camera compensating for each other.
EvRGBHand can tackle overexposure and motion blur issues in RGB-based HMR and foreground scarcity and background overflow issues in event-based HMR.
arXiv Detail & Related papers (2024-03-12T06:04:50Z) - TUMTraf Event: Calibration and Fusion Resulting in a Dataset for
Roadside Event-Based and RGB Cameras [14.57694345706197]
Event-based cameras are well suited to Intelligent Transportation Systems (ITS).
They provide very high temporal resolution and dynamic range, which can eliminate motion blur and improve detection performance at night.
However, event-based images lack color and texture compared to images from a conventional RGB camera.
arXiv Detail & Related papers (2024-01-16T16:25:37Z) - EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms
with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z) - Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z) - SDF-based RGB-D Camera Tracking in Neural Scene Representations [4.83420384410068]
We consider the problem of tracking the 6D pose of a moving RGB-D camera in a neural scene representation.
In particular, we propose to track an RGB-D camera using a signed distance field-based representation and show that compared to density-based representations, tracking can be sped up.
arXiv Detail & Related papers (2022-05-04T14:18:39Z) - E$^2$(GO)MOTION: Motion Augmented Event Stream for Egocentric Action
Recognition [21.199869051111367]
Event cameras capture pixel-level intensity changes in the form of "events".
N-EPIC-Kitchens is the first event-based camera extension of the large-scale EPIC-Kitchens dataset.
We show that event data provides a comparable performance to RGB and optical flow, yet without any additional flow computation at deploy time.
arXiv Detail & Related papers (2021-12-07T09:43:08Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - Moving Object Detection for Event-based vision using Graph Spectral
Clustering [6.354824287948164]
Moving object detection has been a central topic in computer vision owing to its wide range of applications.
We present an unsupervised Graph Spectral Clustering technique for Moving Object Detection in Event-based data.
We additionally show how the optimum number of moving objects can be automatically determined.
arXiv Detail & Related papers (2021-09-30T10:19:22Z) - TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per-pixel brightness changes (see the event-binning sketch after this list).
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
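Several entries above (TUM-VIE, ESL, E$^2$(GO)MOTION) rest on the same raw data format: an event camera emits an asynchronous stream of per-pixel brightness changes, commonly handled as (x, y, timestamp, polarity) tuples that are binned into fixed-size tensors before reaching a network. Below is a minimal sketch under that assumption; the (N, 4) array layout and the function name are illustrative choices, not mandated by any paper above.

```python
import numpy as np

def events_to_frame(events: np.ndarray, height: int, width: int) -> np.ndarray:
    """Bin an event stream into a 2-channel per-pixel count image.

    events: (N, 4) array of (x, y, t, polarity) with polarity in {-1, +1};
            this layout is an assumption for illustration, not a standard.
    Returns a (2, height, width) array: channel 0 counts positive events,
    channel 1 counts negative events.
    """
    frame = np.zeros((2, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    chan = (events[:, 3] < 0).astype(int)   # 0 = positive, 1 = negative
    np.add.at(frame, (chan, y, x), 1.0)     # scatter-add event counts per pixel
    return frame

# Toy stream: 1000 random events on a 346x260 DAVIS346-sized sensor.
rng = np.random.default_rng(0)
ev = np.stack([rng.integers(0, 346, 1000),          # x
               rng.integers(0, 260, 1000),          # y
               np.sort(rng.random(1000)),           # timestamps in [0, 1)
               rng.choice([-1, 1], 1000)], axis=1)  # polarity
print(events_to_frame(ev, height=260, width=346).sum())  # 1000.0
```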
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.