EventSplat: 3D Gaussian Splatting from Moving Event Cameras for Real-time Rendering
- URL: http://arxiv.org/abs/2412.07293v1
- Date: Tue, 10 Dec 2024 08:23:58 GMT
- Title: EventSplat: 3D Gaussian Splatting from Moving Event Cameras for Real-time Rendering
- Authors: Toshiya Yura, Ashkan Mirzaei, Igor Gilitschenski
- Abstract summary: Event cameras offer exceptional temporal resolution and a high dynamic range.
We introduce a method for using event camera data in novel view synthesis via Gaussian Splatting.
- Score: 7.392798832833857
- License:
- Abstract: We introduce a method for using event camera data in novel view synthesis via Gaussian Splatting. Event cameras offer exceptional temporal resolution and a high dynamic range. Leveraging these capabilities allows us to effectively address the novel view synthesis challenge in the presence of fast camera motion. For initialization of the optimization process, our approach uses prior knowledge encoded in an event-to-video model. We also use spline interpolation for obtaining high-quality poses along the event camera trajectory. This enhances the reconstruction quality from fast-moving cameras while overcoming the computational limitations traditionally associated with event-based Neural Radiance Field (NeRF) methods. Our experimental evaluation demonstrates that our results achieve higher visual fidelity and better performance than existing event-based NeRF approaches while being an order of magnitude faster to render.
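The abstract's spline-interpolated camera poses can be illustrated with a small sketch. The snippet below is a numpy-only toy, not the paper's implementation: the function names (`pose_at`, `catmull_rom`, `slerp`) and the keyframe layout are illustrative assumptions. Translations follow a Catmull-Rom cubic spline through sparse keyframe positions, while rotations use SLERP between the two nearest keyframes, a simplification of a full rotation spline.

```python
import numpy as np

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    d = np.dot(q0, q1)
    if d < 0.0:              # flip one quaternion to take the shorter arc
        q1, d = -q1, -d
    theta = np.arccos(min(d, 1.0))
    if theta < 1e-8:         # nearly identical rotations: plain lerp is stable
        q = (1.0 - u) * q0 + u * q1
        return q / np.linalg.norm(q)
    return (np.sin((1.0 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

def catmull_rom(p0, p1, p2, p3, u):
    """Catmull-Rom cubic spline segment from p1 (u=0) to p2 (u=1)."""
    return 0.5 * ((2.0 * p1) + (-p0 + p2) * u
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * u ** 2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * u ** 3)

def pose_at(t, times, trans, quats):
    """Interpolate a camera pose at event timestamp t from sparse keyframes.

    times: sorted (N,) keyframe timestamps
    trans: (N, 3) keyframe translations
    quats: (N, 4) unit quaternions (w, x, y, z)
    """
    i = np.searchsorted(times, t) - 1
    i = int(np.clip(i, 0, len(times) - 2))
    u = (t - times[i]) / (times[i + 1] - times[i])
    i0, i3 = max(i - 1, 0), min(i + 2, len(times) - 1)  # clamp at the ends
    p = catmull_rom(trans[i0], trans[i], trans[i + 1], trans[i3], u)
    q = slerp(quats[i], quats[i + 1], u)
    return p, q
```

For example, with keyframes at t = 0, 1, 2, 3 s, `pose_at(1.5, ...)` yields a pose between the second and third keyframes; this is how an event at an arbitrary timestamp can be associated with a continuous camera trajectory rather than snapped to the nearest keyframe.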
Related papers
- AE-NeRF: Augmenting Event-Based Neural Radiance Fields for Non-ideal Conditions and Larger Scene [31.142207770861457]
We propose AE-NeRF to address the challenges of learning event-based NeRF from non-ideal conditions.
Our method achieves a new state-of-the-art in event-based 3D reconstruction.
arXiv Detail & Related papers (2025-01-06T07:00:22Z) - E-3DGS: Gaussian Splatting with Exposure and Motion Events [29.042018288378447]
We propose E-3DGS, a novel event-based approach that partitions events into motion and exposure.
We introduce a novel integration of 3DGS with exposure events for high-quality reconstruction of explicit scene representations.
Our method is faster and delivers better reconstruction quality than event-based NeRF while being more cost-effective than NeRF methods.
arXiv Detail & Related papers (2024-10-22T13:17:20Z) - EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z) - Robust e-NeRF: NeRF from Sparse & Noisy Events under Non-Uniform Motion [67.15935067326662]
Event cameras offer low power, low latency, high temporal resolution and high dynamic range.
NeRF is seen as the leading candidate for efficient and effective scene representation.
We propose Robust e-NeRF, a novel method to directly and robustly reconstruct NeRFs from moving event cameras.
arXiv Detail & Related papers (2023-09-15T17:52:08Z) - Deformable Neural Radiance Fields using RGB and Event Cameras [65.40527279809474]
We develop a novel method to model the deformable neural radiance fields using RGB and event cameras.
The proposed method uses the asynchronous stream of events and sparse RGB frames.
Experiments conducted on both realistically rendered graphics and real-world datasets demonstrate a significant benefit of the proposed method.
arXiv Detail & Related papers (2023-09-15T14:19:36Z) - EventNeRF: Neural Radiance Fields from a Single Colour Event Camera [81.19234142730326]
This paper proposes the first approach for 3D-consistent, dense and novel view synthesis using just a single colour event stream as input.
At its core is a neural radiance field trained entirely in a self-supervised manner from events while preserving the original resolution of the colour event channels.
We evaluate our method qualitatively and numerically on several challenging synthetic and real scenes and show that it produces significantly denser and more visually appealing renderings.
arXiv Detail & Related papers (2022-06-23T17:59:53Z) - Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - Learning Event-Based Motion Deblurring [39.16921854492941]
Event-based cameras capture fast motion as events at a high temporal rate.
We show how its optimization can be unfolded with a novel end-to-end deep architecture.
The proposed approach achieves state-of-the-art reconstruction quality, and generalizes better to handling real-world motion blur.
arXiv Detail & Related papers (2020-04-13T07:01:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.