Gaussian Splashing: Direct Volumetric Rendering Underwater
- URL: http://arxiv.org/abs/2411.19588v1
- Date: Fri, 29 Nov 2024 10:04:38 GMT
- Title: Gaussian Splashing: Direct Volumetric Rendering Underwater
- Authors: Nir Mualem, Roy Amoyal, Oren Freifeld, Derya Akkaynak
- Abstract summary: We present a new method that takes only a few minutes for reconstruction and renders novel underwater scenes at 140 FPS.
Named Gaussian Splashing, our method unifies the strengths and speed of 3DGS with an image formation model for capturing scattering.
It reveals distant scene details with far greater clarity than other methods, dramatically improving reconstructed and rendered images.
- Abstract: In underwater images, most useful features are occluded by water. The extent of the occlusion depends on imaging geometry and can vary even across a sequence of burst images. As a result, 3D reconstruction methods robust on in-air scenes, like Neural Radiance Field methods (NeRFs) or 3D Gaussian Splatting (3DGS), fail on underwater scenes. While a recent underwater adaptation of NeRFs achieved state-of-the-art results, it is impractically slow: reconstruction takes hours and its rendering rate, in frames per second (FPS), is less than 1. Here, we present a new method that takes only a few minutes for reconstruction and renders novel underwater scenes at 140 FPS. Named Gaussian Splashing, our method unifies the strengths and speed of 3DGS with an image formation model for capturing scattering, introducing innovations in the rendering and depth estimation procedures and in the 3DGS loss function. Despite the complexities of underwater adaptation, our method produces images at unparalleled speeds with superior details. Moreover, it reveals distant scene details with far greater clarity than other methods, dramatically improving reconstructed and rendered images. We demonstrate results on existing datasets and a new dataset we have collected. Additional visual results are available at: https://bgu-cs-vil.github.io/gaussiansplashingUW.github.io/ .
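The abstract does not spell out the image formation model it builds on. As a rough sketch only: underwater rendering pipelines in this family typically compose an attenuated direct signal with a backscatter term along each ray, in the spirit of the revised underwater image formation model. All coefficient values and names below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def underwater_image(J, z, beta_D, beta_B, B_inf):
    """Simplified revised underwater image formation model (sketch):
    I = J * exp(-beta_D * z) + B_inf * (1 - exp(-beta_B * z))
    J:      clear-scene radiance per RGB channel
    z:      range along the ray, in meters
    beta_D: direct-signal attenuation coefficients per channel
    beta_B: backscatter coefficients per channel
    B_inf:  veiling light (backscatter at infinite range)
    """
    direct = J * np.exp(-beta_D * z)
    backscatter = B_inf * (1.0 - np.exp(-beta_B * z))
    return direct + backscatter

# Example: a mid-gray surface 5 m away in blue-green water
# (red attenuates fastest; all values are made up for illustration).
J = np.array([0.5, 0.5, 0.5])
z = 5.0
beta_D = np.array([0.40, 0.20, 0.15])
beta_B = np.array([0.30, 0.15, 0.12])
B_inf = np.array([0.05, 0.30, 0.40])
I = underwater_image(J, z, beta_D, beta_B, B_inf)
```

With these illustrative coefficients the red channel is attenuated far more than the blue, and as the range grows the observed color converges to the veiling light, which is why distant scene detail is the hardest to recover.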
Related papers
- DehazeGS: Seeing Through Fog with 3D Gaussian Splatting [17.119969983512533]
We introduce DehazeGS, a method capable of decomposing and rendering a fog-free background from participating media.
Experiments on both synthetic and real-world foggy datasets demonstrate that DehazeGS achieves state-of-the-art performance.
arXiv Detail & Related papers (2025-01-07T09:47:46Z)
- Speedy-Splat: Fast 3D Gaussian Splatting with Sparse Pixels and Sparse Primitives [60.217580865237835]
3D Gaussian Splatting (3D-GS) is a recent 3D scene reconstruction technique that enables real-time rendering of novel views by modeling scenes as parametric point clouds of differentiable 3D Gaussians.
We identify and address two key inefficiencies in 3D-GS, achieving substantial improvements in rendering speed, model size, and training time.
Our Speedy-Splat approach combines these techniques to accelerate average rendering speed by a drastic $6.71\times$ across scenes from the Mip-NeRF 360, Tanks & Temples, and Deep Blending datasets with $10.6\times$ fewer primitives than 3D-GS.
arXiv Detail & Related papers (2024-11-30T20:25:56Z)
- SeaSplat: Representing Underwater Scenes with 3D Gaussian Splatting and a Physically Grounded Image Formation Model [11.57677379828992]
We introduce SeaSplat, a method to enable real-time rendering of underwater scenes leveraging recent advances in 3D radiance fields.
We apply SeaSplat to real-world scenes from the SeaThru-NeRF dataset, collected by an underwater vehicle in the US Virgin Islands.
We show that the underwater image formation model helps learn scene structure, yielding better depth maps, and that our additions preserve the significant computational gains afforded by a 3D Gaussian representation.
arXiv Detail & Related papers (2024-09-25T20:45:19Z)
- WaterSplatting: Fast Underwater 3D Scene Reconstruction Using Gaussian Splatting [39.58317527488534]
We propose a novel approach that fuses volumetric rendering with 3DGS to handle underwater data effectively.
Our method outperforms state-of-the-art NeRF-based methods in rendering quality on the underwater SeaThru-NeRF dataset.
arXiv Detail & Related papers (2024-08-15T15:16:49Z)
- RecGS: Removing Water Caustic with Recurrent Gaussian Splatting [13.87415686123919]
Water caustics are commonly observed in seafloor imaging data from shallow-water areas.
Traditional methods that remove caustic patterns from images often rely on 2D filtering or pre-training on an annotated dataset.
We present a novel method Recurrent Gaussian Splatting (RecGS), which takes advantage of today's photorealistic 3D reconstruction technology.
arXiv Detail & Related papers (2024-07-14T20:24:44Z)
- PUP 3D-GS: Principled Uncertainty Pruning for 3D Gaussian Splatting [59.277480452459315]
We propose a principled sensitivity pruning score that preserves visual fidelity and foreground details at significantly higher compression ratios.
We also propose a multi-round prune-refine pipeline that can be applied to any pretrained 3D-GS model without changing its training pipeline.
arXiv Detail & Related papers (2024-06-14T17:53:55Z)
- BAD-Gaussians: Bundle Adjusted Deblur Gaussian Splatting [8.380954205255104]
BAD-Gaussians is a novel approach to handle severe motion-blurred images with inaccurate camera poses.
Our method achieves superior rendering quality compared to previous state-of-the-art deblur neural rendering methods.
arXiv Detail & Related papers (2024-03-18T14:43:04Z)
- VastGaussian: Vast 3D Gaussians for Large Scene Reconstruction [59.40711222096875]
We present VastGaussian, the first method for high-quality reconstruction and real-time rendering on large scenes based on 3D Gaussian Splatting.
Our approach outperforms existing NeRF-based methods and achieves state-of-the-art results on multiple large scene datasets.
arXiv Detail & Related papers (2024-02-27T11:40:50Z)
- Splatter Image: Ultra-Fast Single-View 3D Reconstruction [67.96212093828179]
Splatter Image is based on Gaussian Splatting, which allows fast and high-quality reconstruction of 3D scenes from multiple images.
We learn a neural network that, at test time, performs reconstruction in a feed-forward manner, at 38 FPS.
On several synthetic, real, multi-category and large-scale benchmark datasets, we achieve better results in terms of PSNR, LPIPS, and other metrics while training and evaluating much faster than prior works.
arXiv Detail & Related papers (2023-12-20T16:14:58Z)
- Scaffold-GS: Structured 3D Gaussians for View-Adaptive Rendering [71.44349029439944]
The recent 3D Gaussian Splatting method has achieved state-of-the-art rendering quality and speed.
We introduce Scaffold-GS, which uses anchor points to distribute local 3D Gaussians.
We show that our method effectively reduces redundant Gaussians while delivering high-quality rendering.
arXiv Detail & Related papers (2023-11-30T17:58:57Z)
- Nerfbusters: Removing Ghostly Artifacts from Casually Captured NeRFs [78.75872372856597]
Casually captured Neural Radiance Fields (NeRFs) suffer from artifacts such as floaters or flawed geometry when rendered outside the camera trajectory.
We propose a new dataset and evaluation procedure, where two camera trajectories are recorded of the scene.
We show that this data-driven prior removes floaters and improves scene geometry for casual captures.
arXiv Detail & Related papers (2023-04-20T17:59:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.