Gradient events: improved acquisition of visual information in event cameras
- URL: http://arxiv.org/abs/2409.01764v1
- Date: Tue, 3 Sep 2024 10:18:35 GMT
- Title: Gradient events: improved acquisition of visual information in event cameras
- Authors: Eero Lehtonen, Tuomo Komulainen, Ari Paasio, Mika Laiho
- Abstract summary: We propose a new type of event, the gradient event, which retains the desirable properties of a conventional brightness event.
We show that gradient-event-based video reconstruction outperforms existing state-of-the-art brightness-event-based methods by a significant margin.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Current event cameras are bio-inspired sensors that respond to brightness changes in the scene asynchronously and independently for every pixel, and transmit these changes as ternary event streams. Event cameras have several benefits over conventional digital cameras, such as significantly higher temporal resolution and pixel bandwidth resulting in reduced motion blur, and very high dynamic range. However, they also introduce challenges such as the difficulty of applying existing computer vision algorithms to the output event streams, and the flood of uninformative events in the presence of oscillating light sources. Here we propose a new type of event, the gradient event, which retains the desirable properties of a conventional brightness event, but which is by design much less sensitive to oscillating light sources, and which enables considerably better grayscale frame reconstruction. We show that gradient-event-based video reconstruction outperforms existing state-of-the-art brightness-event-based methods by a significant margin, when evaluated on publicly available event-to-video datasets. Our results show how gradient information can be used to significantly improve the acquisition of visual information by an event camera.
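For intuition, the sketch below contrasts the standard per-pixel brightness-event model with an illustrative gradient-event variant that thresholds changes in the spatial gradient of log intensity. This is a minimal sketch under stated assumptions: the function names and the exact gradient-event formulation are illustrative guesses based on the abstract, not the paper's definition.

```python
import numpy as np

def brightness_events(ref_log, curr_log, threshold=0.2):
    """Standard DVS-style model: a pixel emits a +1/-1 event when its log
    intensity drifts from the per-pixel reference by the contrast threshold;
    the reference resets wherever an event fired."""
    diff = curr_log - ref_log
    events = np.where(diff >= threshold, 1,
                      np.where(diff <= -threshold, -1, 0)).astype(np.int8)
    new_ref = np.where(events != 0, curr_log, ref_log)
    return events, new_ref

def gradient_events(ref_grad, curr_log, threshold=0.2):
    """Illustrative gradient-event variant (an assumption, not the paper's
    exact scheme): threshold changes in the spatial gradient of the log
    image. A spatially uniform flicker adds only a constant to the log
    image, leaves its spatial gradient untouched, and so fires no events."""
    gy, gx = np.gradient(curr_log)   # spatial gradient of the log image
    curr_grad = np.stack([gy, gx])   # shape (2, H, W)
    diff = curr_grad - ref_grad
    events = np.where(diff >= threshold, 1,
                      np.where(diff <= -threshold, -1, 0)).astype(np.int8)
    new_ref = np.where(events != 0, curr_grad, ref_grad)
    return events, new_ref
```

In a streaming loop, `ref_log` and `ref_grad` carry the per-pixel state between readouts; a real sensor updates this state asynchronously in circuitry rather than frame by frame, but the thresholding logic is the same.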
Related papers
- EventSplat: 3D Gaussian Splatting from Moving Event Cameras for Real-time Rendering [7.392798832833857]
Event cameras offer exceptional temporal resolution and a high dynamic range.
We introduce a method for using event camera data in novel view synthesis via Gaussian Splatting.
arXiv Detail & Related papers (2024-12-10T08:23:58Z)
- Dynamic EventNeRF: Reconstructing General Dynamic Scenes from Multi-view Event Cameras [69.65147723239153]
Volumetric reconstruction of dynamic scenes is an important problem in computer vision.
It is especially challenging in poor lighting and with fast motion.
We propose the first method to temporally reconstruct a scene from sparse multi-view event streams and sparse RGB frames.
arXiv Detail & Related papers (2024-12-09T18:56:18Z)
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z)
- Generalized Event Cameras [15.730999915036705]
Event cameras capture the world at high time resolution and with minimal bandwidth requirements.
We design generalized event cameras that inherently preserve scene intensity in a bandwidth-efficient manner.
Our single-photon event cameras are capable of high-speed, high-fidelity imaging at low readout rates.
arXiv Detail & Related papers (2024-07-02T21:48:32Z)
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z)
- Revisiting Event-based Video Frame Interpolation [49.27404719898305]
Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation.
Estimating optical flow from events is arguably more difficult than from RGB information.
We propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages.
arXiv Detail & Related papers (2023-07-24T06:51:07Z)
- E$^2$(GO)MOTION: Motion Augmented Event Stream for Egocentric Action Recognition [21.199869051111367]
Event cameras capture pixel-level intensity changes in the form of "events".
N-EPIC-Kitchens is the first event-based camera extension of the large-scale EPIC-Kitchens dataset.
We show that event data provides a comparable performance to RGB and optical flow, yet without any additional flow computation at deployment time.
arXiv Detail & Related papers (2021-12-07T09:43:08Z)
- MEFNet: Multi-scale Event Fusion Network for Motion Deblurring [62.60878284671317]
Traditional frame-based cameras inevitably suffer from motion blur due to long exposure times.
As a bio-inspired camera, the event camera records intensity changes asynchronously with high temporal resolution.
In this paper, we rethink the event-based image deblurring problem and unfold it into an end-to-end two-stage image restoration network.
arXiv Detail & Related papers (2021-11-30T23:18:35Z)
- Reducing the Sim-to-Real Gap for Event Cameras [64.89183456212069]
Event cameras are paradigm-shifting novel sensors that report asynchronous, per-pixel brightness changes called 'events' with unparalleled low latency.
Recent work has demonstrated impressive results using Convolutional Neural Networks (CNNs) for video reconstruction and optical flow with events.
We present strategies for improving training data for event-based CNNs that yield a 20-40% boost in the performance of existing video reconstruction networks; a sketch of this kind of event synthesis from video follows after this list.
arXiv Detail & Related papers (2020-03-20T02:44:29Z)
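As context for the sim-to-real entry above: training data for event-based CNNs is commonly synthesized from ordinary video by simulating the contrast-threshold mechanism (in the spirit of ESIM-style simulators). The following is a minimal, simplified sketch of that idea, not the cited paper's pipeline; the function name `video_to_events`, its interface, and the frame-time timestamping are assumptions, and a real simulator additionally interpolates exact crossing times and models sensor noise.

```python
import numpy as np

def video_to_events(frames, timestamps, threshold=0.2, eps=1e-3):
    """Sketch of contrast-threshold event synthesis from video.

    frames: (T, H, W) float array of intensities in [0, 1]
    timestamps: (T,) array of frame times in seconds
    Returns a time-ordered list of (t, y, x, polarity) tuples.
    """
    ref = np.log(frames[0] + eps)  # per-pixel reference log intensity
    events = []
    for k in range(1, len(frames)):
        curr = np.log(frames[k] + eps)
        while True:
            diff = curr - ref
            pol = np.sign(diff) * (np.abs(diff) >= threshold)
            ys, xs = np.nonzero(pol)
            if len(ys) == 0:
                break
            for y, x in zip(ys, xs):
                # Simplification: stamp the triggering frame's time; a real
                # simulator interpolates the exact threshold-crossing time.
                events.append((float(timestamps[k]), int(y), int(x),
                               int(pol[y, x])))
            # Step each reference one threshold toward the current value so
            # that large brightness changes emit multiple events.
            ref[ys, xs] += threshold * pol[ys, xs]
    return events
```

The inner loop repeats until every pixel's residual change is below the threshold, which is what lets a single large brightness step emit a burst of events, as a real sensor would.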
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.