ExBluRF: Efficient Radiance Fields for Extreme Motion Blurred Images
- URL: http://arxiv.org/abs/2309.08957v3
- Date: Sat, 24 Feb 2024 12:36:49 GMT
- Title: ExBluRF: Efficient Radiance Fields for Extreme Motion Blurred Images
- Authors: Dongwoo Lee, Jeongtaek Oh, Jaesung Rim, Sunghyun Cho and Kyoung Mu Lee
- Abstract summary: We present ExBluRF, a novel view synthesis method for extreme motion blurred images.
Our approach consists of two main components: 6-DOF camera trajectory-based motion blur formulation and voxel-based radiance fields.
Compared with existing works, our approach restores much sharper 3D scenes with an order of magnitude less training time and GPU memory consumption.
- Score: 58.24910105459957
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We present ExBluRF, a novel view synthesis method for extreme motion blurred
images based on efficient radiance fields optimization. Our approach consists
of two main components: 6-DOF camera trajectory-based motion blur formulation
and voxel-based radiance fields. From extremely blurred images, we optimize the
sharp radiance fields by jointly estimating the camera trajectories that
generate the blurry images. In training, multiple rays along the camera
trajectory are accumulated to reconstruct a single blurry color, which is
equivalent to the physical motion blur operation. We minimize the
photo-consistency loss in blurred image space and obtain sharp radiance
fields with camera trajectories that explain the blur of all images. The joint
optimization in blurred image space demands computation and resources that
grow in proportion to the blur size. Our method solves this problem by
replacing the MLP-based framework with low-dimensional 6-DOF camera poses and
voxel-based radiance fields. Compared with existing works, our approach
restores much sharper 3D scenes from challenging motion-blurred views with an
order of magnitude less training time and GPU memory consumption.
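To make the blur formulation concrete, below is a minimal sketch (not the authors' implementation) of the trajectory-based blur model and photo-consistency loss: a blurry pixel is synthesized by averaging sharp colors rendered along the estimated exposure trajectory, and the loss compares it to the observed blurry pixel. The `render_color` placeholder, the linear pose interpolation, and the 6-D pose vectors are illustrative assumptions.

```python
# Minimal sketch of the trajectory-based blur model described in the abstract:
# a blurry pixel is the average of sharp colors rendered along the estimated
# 6-DOF camera trajectory during exposure.
import torch

def render_color(pose: torch.Tensor, pixel: torch.Tensor) -> torch.Tensor:
    """Placeholder for the sharp voxel-based radiance-field renderer (RGB)."""
    # A real implementation would cast a ray through `pixel` using `pose`
    # and integrate the voxel grid; this returns a dummy differentiable value.
    return torch.sigmoid(pose[:3] + 0.01 * pixel.sum())

def blurry_color(pose_start: torch.Tensor, pose_end: torch.Tensor,
                 pixel: torch.Tensor, n_samples: int = 8) -> torch.Tensor:
    """Accumulate colors along the exposure trajectory (physical blur model)."""
    colors = []
    for t in torch.linspace(0.0, 1.0, n_samples):
        pose_t = (1 - t) * pose_start + t * pose_end  # assumed interpolation
        colors.append(render_color(pose_t, pixel))
    return torch.stack(colors).mean(dim=0)

# Photo-consistency loss in blurred image space: the synthesized blurry
# color must match the observed blurry pixel.
pose_start = torch.zeros(6, requires_grad=True)  # 6-DOF pose, jointly optimized
pose_end = torch.full((6,), 0.1, requires_grad=True)
pixel = torch.tensor([320.0, 240.0])
observed = torch.tensor([0.5, 0.5, 0.5])

loss = torch.mean((blurry_color(pose_start, pose_end, pixel) - observed) ** 2)
loss.backward()  # gradients flow to both the renderer and the trajectory
```

Because the radiance field here is a voxel grid and the trajectory is only a low-dimensional pose curve, the gradient step per blurry pixel stays cheap even as the number of accumulated rays grows with the blur size.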
Related papers
- Motion Blur Decomposition with Cross-shutter Guidance [33.72961622720793]
Motion blur is an artifact of insufficient illumination: exposure time must be prolonged to collect enough photons for a sufficiently bright image.
Recent research has aimed at decomposing a blurry image into multiple sharp images with spatial and temporal coherence.
We propose to utilize the ordered scanline-wise delay in a rolling shutter image to make motion decomposition of a single blurry image more robust.
arXiv Detail & Related papers (2024-04-01T13:55:40Z)
- Gaussian Splatting on the Move: Blur and Rolling Shutter Compensation for Natural Camera Motion [25.54868552979793]
We present a method that adapts to camera motion and allows high-quality scene reconstruction with handheld video data.
Our results with both synthetic and real data demonstrate superior performance in mitigating camera motion over existing methods.
arXiv Detail & Related papers (2024-03-20T06:19:41Z)
- Adaptive Window Pruning for Efficient Local Motion Deblurring [81.35217764881048]
Local motion blur commonly occurs in real-world photography due to the mixing of moving objects and stationary backgrounds during exposure.
Existing image deblurring methods predominantly focus on global deblurring.
This paper aims to adaptively and efficiently restore high-resolution locally blurred images.
arXiv Detail & Related papers (2023-06-25T15:24:00Z)
- Progressively Optimized Local Radiance Fields for Robust View Synthesis [76.55036080270347]
We present an algorithm for reconstructing the radiance field of a large-scale scene from a single casually captured video.
For handling unknown poses, we jointly estimate the camera poses and the radiance field in a progressive manner.
For handling large unbounded scenes, we dynamically allocate new local radiance fields trained with frames within a temporal window.
arXiv Detail & Related papers (2023-03-24T04:03:55Z)
- PDRF: Progressively Deblurring Radiance Field for Fast and Robust Scene Reconstruction from Blurry Images [75.87721926918874]
We present Progressively Deblurring Radiance Field (PDRF), a novel approach to efficiently reconstructing high-quality radiance fields from blurry images.
We show that PDRF is 15x faster than previous state-of-the-art scene reconstruction methods.
arXiv Detail & Related papers (2022-08-17T03:42:29Z)
- High Dynamic Range and Super-Resolution from Raw Image Bursts [52.341483902624006]
This paper introduces the first approach to reconstruct high-resolution, high-dynamic range color images from raw photographic bursts captured by a handheld camera with exposure bracketing.
The proposed algorithm is fast, with low memory requirements compared to state-of-the-art learning-based approaches to image restoration.
Experiments demonstrate its excellent performance with super-resolution factors of up to $\times 4$ on real photographs taken in the wild with hand-held cameras.
arXiv Detail & Related papers (2022-07-29T13:31:28Z)
- Learning Spatially Varying Pixel Exposures for Motion Deblurring [49.07867902677453]
We present a novel approach that leverages spatially varying pixel exposures for motion deblurring.
Our work illustrates the promising role that focal-plane sensor-processors can play in the future of computational imaging.
arXiv Detail & Related papers (2022-04-14T23:41:49Z)
- ROSEFusion: Random Optimization for Online Dense Reconstruction under Fast Camera Motion [15.873973449155313]
Reconstruction based on RGB-D sequences has thus far been restrained to relatively slow camera motions (1 m/s).
Fast motion brings two challenges to depth fusion: 1) the high nonlinearity of camera pose optimization due to large inter-frame rotations and 2) the lack of reliably trackable features due to motion blur.
We propose to tackle the difficulties of fast-motion camera tracking in the absence of inertial measurements using random optimization.
Thanks to the efficient template-based particle set evolution and the effective fitness function, our method attains good-quality pose tracking under fast camera motion (up to 4 m/s); see the sketch following this list.
arXiv Detail & Related papers (2021-05-12T11:37:34Z)
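As referenced above, here is a minimal sketch of the random-optimization idea behind ROSEFusion-style pose tracking, under simplifying assumptions: the 6-D pose vector, the Gaussian particle perturbations, and the toy point-to-point fitness function are illustrative stand-ins for the paper's template-based particle set and depth-fusion-based fitness.

```python
# Sketch of random optimization for camera pose tracking: repeatedly perturb
# the current pose estimate with a particle set and keep the candidate that
# maximizes a fitness function. Not ROSEFusion itself; see caveats above.
import numpy as np

def fitness(pose: np.ndarray, src: np.ndarray, dst: np.ndarray) -> float:
    """Toy fitness: negative mean distance after translating src by pose[:3]."""
    # A real system would apply the full SE(3) transform and score the pose
    # against the fused depth/TSDF; rotation components are unused here.
    moved = src + pose[:3]
    return -float(np.mean(np.linalg.norm(moved - dst, axis=1)))

def track_pose(src, dst, n_particles=64, n_iters=30, sigma=0.05, seed=0):
    rng = np.random.default_rng(seed)
    best = np.zeros(6)                      # [tx, ty, tz, rx, ry, rz]
    best_fit = fitness(best, src, dst)
    for _ in range(n_iters):
        # Particle set: Gaussian perturbations around the current best pose.
        particles = best + rng.normal(0.0, sigma, size=(n_particles, 6))
        fits = np.array([fitness(p, src, dst) for p in particles])
        if fits.max() > best_fit:           # greedy particle-set evolution
            best, best_fit = particles[fits.argmax()], fits.max()
    return best

# Usage: recover a known translation between two toy point clouds.
rng = np.random.default_rng(1)
src = rng.normal(size=(100, 3))
dst = src + np.array([0.3, -0.2, 0.1])
print(track_pose(src, dst)[:3])             # approaches [0.3, -0.2, 0.1]
```

The appeal of this gradient-free scheme is that it tolerates the highly nonlinear, poorly conditioned objectives that arise under large inter-frame rotations, where local linearization in classical ICP-style tracking tends to fail.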
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences arising from its use.