DeblurSplat: SfM-free 3D Gaussian Splatting with Event Camera for Robust Deblurring
- URL: http://arxiv.org/abs/2509.18898v1
- Date: Tue, 23 Sep 2025 11:21:54 GMT
- Title: DeblurSplat: SfM-free 3D Gaussian Splatting with Event Camera for Robust Deblurring
- Authors: Pengteng Li, Yunfan Lu, Pinhao Song, Weiyu Guo, Huizai Yao, F. Richard Yu, Hui Xiong
- Abstract summary: We propose the first Structure-from-Motion (SfM)-free deblurring 3D Gaussian Splatting method via event camera, dubbed DeblurSplat. We leverage the pretrained capability of the dense stereo module (DUSt3R) to directly obtain accurate initial point clouds from blurred images.
- Score: 50.21760380168387
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose the first Structure-from-Motion (SfM)-free deblurring 3D Gaussian Splatting method via event camera, dubbed DeblurSplat. We address the motion-deblurring problem in two ways. First, we leverage the pretrained capability of the dense stereo module (DUSt3R) to directly obtain accurate initial point clouds from blurred images. By not computing camera poses as an intermediate result, we avoid propagating errors from inaccurate camera poses into the initial point clouds' positions. Second, we introduce the event stream into the deblurring pipeline for its high sensitivity to dynamic changes. By decoding latent sharp images from the event stream and the blurred images, we can provide a fine-grained supervision signal for scene reconstruction optimization. Extensive experiments across a range of scenes demonstrate that DeblurSplat not only excels in generating high-fidelity novel views but also achieves significant rendering efficiency compared to state-of-the-art deblurring 3D-GS methods.
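The abstract's second idea, decoding latent sharp images from an event stream and a blurry frame, is commonly formalized by the event generation model: a blurry image is the temporal average of latent sharp images, and events record log-intensity changes. A minimal NumPy sketch of this decoding in the style of the Event-based Double Integral (EDI) relation is given below; the function names, the event layout `[x, y, polarity]`, and the contrast threshold `c` are illustrative assumptions, not the paper's actual decoder.

```python
import numpy as np

def integrate_events(events, timestamps, t_a, t_b, H, W):
    """Signed per-pixel event count between times t_a and t_b.

    events: (N, 3) array of [x, y, polarity]; timestamps: (N,) array.
    Integrating backward in time flips the sign.
    """
    lo, hi = min(t_a, t_b), max(t_a, t_b)
    mask = (timestamps >= lo) & (timestamps < hi)
    sign = 1.0 if t_b >= t_a else -1.0
    E = np.zeros((H, W))
    xs = events[mask, 0].astype(int)
    ys = events[mask, 1].astype(int)
    ps = events[mask, 2]
    np.add.at(E, (ys, xs), sign * ps)  # unbuffered scatter-add per pixel
    return E

def edi_latent_image(blurry, events, timestamps, t_ref, t0, t1, c=0.2, n=64):
    """Recover the latent sharp image at time t_ref from a blurry frame.

    EDI model: B = (1/T) * integral_{t0}^{t1} L(t) dt, with
    L(t) = L(t_ref) * exp(c * E(t_ref -> t)), where E is the signed
    event count. Hence L(t_ref) = B / mean_t exp(c * E(t_ref -> t)).
    """
    H, W = blurry.shape
    acc = np.zeros((H, W))
    for t in np.linspace(t0, t1, n):  # discretize the exposure interval
        E = integrate_events(events, timestamps, t_ref, t, H, W)
        acc += np.exp(c * E)
    return blurry / (acc / n + 1e-8)
```

With no events the exponential terms are all one and the "latent" image equals the blurry input, which is a quick sanity check for the discretization.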
Related papers
- iGaussian: Real-Time Camera Pose Estimation via Feed-Forward 3D Gaussian Splatting Inversion [62.09575122593993]
iGaussian is a two-stage feed-forward framework that achieves real-time camera pose estimation through direct 3D Gaussian inversion. Experimental results on the NeRF Synthetic, Mip-NeRF 360, and T&T+DB datasets demonstrate a significant performance improvement over previous methods.
arXiv Detail & Related papers (2025-11-18T05:22:22Z)
- Blur2seq: Blind Deblurring and Camera Trajectory Estimation from a Single Camera Motion-blurred Image [2.842028685390758]
Motion blur caused by camera shake, particularly under large or rotational movements, is a major challenge in image restoration. We propose a deep learning framework that jointly estimates the latent sharp image and the underlying camera motion trajectory from a single blurry image. Our method achieves state-of-the-art performance on both synthetic and real datasets.
arXiv Detail & Related papers (2025-10-23T13:26:07Z)
- BSGS: Bi-stage 3D Gaussian Splatting for Camera Motion Deblurring [32.5099324617965]
Bi-Stage 3D Gaussian Splatting is a novel framework to reconstruct 3D scenes from motion-blurred images. We propose a subframe gradient aggregation strategy to optimize both stages. Experiments verify the effectiveness of our proposed deblurring method and show its superiority over the state of the art.
arXiv Detail & Related papers (2025-10-14T13:26:56Z)
- RobustSplat: Decoupling Densification and Dynamics for Transient-Free 3DGS [79.15416002879239]
3D Gaussian Splatting has gained significant attention for its real-time, photo-realistic rendering in novel-view synthesis and 3D modeling. Existing methods struggle to accurately model scenes affected by transient objects, leading to artifacts in the rendered images. We propose RobustSplat, a robust solution based on two critical designs.
arXiv Detail & Related papers (2025-06-03T11:13:48Z)
- BeSplat: Gaussian Splatting from a Single Blurry Image and Event Stream [13.649334929746413]
3D Gaussian Splatting (3DGS) has effectively addressed key challenges, such as long training times and slow rendering speeds. We demonstrate the recovery of a sharp radiance field (Gaussian splats) from a single motion-blurred image and its corresponding event stream.
arXiv Detail & Related papers (2024-12-26T22:35:29Z)
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [72.60992807941885]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution. We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS. We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z)
- IncEventGS: Pose-Free Gaussian Splatting from a Single Event Camera [6.879406129086464]
IncEventGS is an incremental 3D Gaussian splatting reconstruction algorithm that uses a single event camera. We exploit the tracking and mapping paradigm of conventional SLAM pipelines for IncEventGS.
arXiv Detail & Related papers (2024-10-10T16:54:23Z)
- Event3DGS: Event-Based 3D Gaussian Splatting for High-Speed Robot Egomotion [54.197343533492486]
Event3DGS can reconstruct high-fidelity 3D structure and appearance under high-speed egomotion.
Experiments on multiple synthetic and real-world datasets demonstrate the superiority of Event3DGS compared with existing event-based dense 3D scene reconstruction frameworks.
Our framework also allows one to incorporate a few motion-blurred frame-based measurements into the reconstruction process to further improve appearance fidelity without loss of structural accuracy.
arXiv Detail & Related papers (2024-06-05T06:06:03Z)
- DeblurGS: Gaussian Splatting for Camera Motion Blur [45.13521168573883]
We propose DeblurGS, a method to optimize sharp 3D Gaussian Splatting from motion-blurred images.
We restore a fine-grained sharp scene by leveraging the remarkable reconstruction capability of 3D Gaussian Splatting.
Our approach estimates the 6-Degree-of-Freedom camera motion for each blurry observation and synthesizes corresponding blurry renderings.
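Synthesizing a blurry rendering from an estimated camera motion, as DeblurGS's summary describes, is commonly done by averaging sharp renderings at poses sampled along the trajectory; the synthesized blur can then be compared against the observed blurry image. A minimal sketch follows; `render`, the pose representation, and the linear interpolation are placeholder assumptions, not the paper's implementation (real pipelines interpolate on SE(3)).

```python
import numpy as np

def lerp_pose(p0, p1, s):
    """Toy linear interpolation of a pose vector (stand-in for SE(3) interpolation)."""
    return (1.0 - s) * p0 + s * p1

def synthesize_blur(render, pose_start, pose_end, n=8):
    """Approximate a motion-blurred image as the average of sharp
    renderings at n poses interpolated along the camera trajectory."""
    frames = [render(lerp_pose(pose_start, pose_end, s))
              for s in np.linspace(0.0, 1.0, n)]
    return np.mean(frames, axis=0)
```

In an optimization loop, the start/end poses (and any intermediate ones) are free parameters updated so that the synthesized blur matches the captured blurry observation.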
arXiv Detail & Related papers (2024-04-17T13:14:52Z)
- Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction [54.00007868515432]
Existing methods face challenges in estimating the accurate correction field due to the uniform velocity assumption.
We propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixels.
Our method surpasses the state-of-the-art by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively.
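The quadratic motion model behind such a rolling-shutter solver can be illustrated with a toy per-row displacement field: each image row is read out at a different time, and under constant acceleration the camera displacement at row time t is d(t) = v*t + 0.5*a*t^2. The sketch below is an illustrative stand-in, not the paper's QRS solver; all names and the 2D-translation-only motion are assumptions.

```python
import numpy as np

def quadratic_rs_field(v, a, H, W, readout=1.0):
    """Per-pixel 2D correction displacement under a quadratic
    (constant-acceleration) motion model across the rolling-shutter readout.

    v, a: (2,) velocity and acceleration; returns an (H, W, 2) field where
    row r uses readout time t_r = (r / H) * readout.
    """
    t = (np.arange(H) / H * readout)[:, None, None]            # (H, 1, 1) row times
    d = v[None, None, :] * t + 0.5 * a[None, None, :] * t**2   # (H, 1, 2)
    return np.broadcast_to(d, (H, W, 2)).copy()                # same shift along each row
```

With zero acceleration this degenerates to the uniform-velocity assumption the entry criticizes: displacement grows linearly with row index, which is exactly what a higher-order solver generalizes.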
arXiv Detail & Related papers (2023-03-31T15:09:18Z)
- Shakes on a Plane: Unsupervised Depth Estimation from Unstabilized Photography [54.36608424943729]
We show that in a ''long-burst'', forty-two 12-megapixel RAW frames captured in a two-second sequence, there is enough parallax information from natural hand tremor alone to recover high-quality scene depth.
We devise a test-time optimization approach that fits a neural RGB-D representation to long-burst data and simultaneously estimates scene depth and camera motion.
arXiv Detail & Related papers (2022-12-22T18:54:34Z)