WaterClear-GS: Optical-Aware Gaussian Splatting for Underwater Reconstruction and Restoration
- URL: http://arxiv.org/abs/2601.19753v1
- Date: Tue, 27 Jan 2026 16:14:34 GMT
- Title: WaterClear-GS: Optical-Aware Gaussian Splatting for Underwater Reconstruction and Restoration
- Authors: Xinrui Zhang, Yufeng Wang, Shuangkang Fang, Zesheng Wang, Dacheng Qi, Wenrui Ding,
- Abstract summary: We introduce WaterClear-GS, the first pure 3DGS-based framework that integrates underwater optical properties into Gaussian primitives. Our method employs a dual-branch optimization strategy to ensure underwater photometric consistency while naturally recovering water-free appearances. Experiments on standard benchmarks and our newly collected dataset demonstrate that WaterClear-GS achieves outstanding performance on both novel view synthesis (NVS) and underwater image restoration tasks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Underwater 3D reconstruction and appearance restoration are hindered by the complex optical properties of water, such as wavelength-dependent attenuation and scattering. Existing Neural Radiance Fields (NeRF)-based methods struggle with slow rendering speeds and suboptimal color restoration, while 3D Gaussian Splatting (3DGS) inherently lacks the capability to model complex volumetric scattering effects. To address these issues, we introduce WaterClear-GS, the first pure 3DGS-based framework that explicitly integrates underwater optical properties of local attenuation and scattering into Gaussian primitives, eliminating the need for an auxiliary medium network. Our method employs a dual-branch optimization strategy to ensure underwater photometric consistency while naturally recovering water-free appearances. This strategy is enhanced by depth-guided geometry regularization and perception-driven image loss, together with exposure constraints, spatially-adaptive regularization, and physically guided spectral regularization, which collectively enforce local 3D coherence and maintain natural visual perception. Experiments on standard benchmarks and our newly collected dataset demonstrate that WaterClear-GS achieves outstanding performance on both novel view synthesis (NVS) and underwater image restoration (UIR) tasks, while maintaining real-time rendering. The code will be available at https://buaaxrzhang.github.io/WaterClear-GS/.
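The attenuation-plus-backscatter degradation the abstract refers to is commonly written as the revised underwater image formation model, with separate per-channel coefficients for the direct signal and the veiling light. Below is a minimal NumPy sketch of that model and its inversion; the function names and coefficient values are illustrative, not taken from the WaterClear-GS code.

```python
import numpy as np

# Wavelength-dependent underwater image formation (illustrative sketch):
#
#   I_c = J_c * exp(-beta_att_c * d) + B_inf_c * (1 - exp(-beta_sc_c * d))
#
# J is the water-free radiance, d the in-water path length, beta_att and
# beta_sc per-channel attenuation and backscatter coefficients, and
# B_inf the veiling light at infinite distance.

def underwater_forward(J, depth, beta_att, beta_sc, B_inf):
    """Degrade a clear image J (H, W, 3) given a depth map (H, W)."""
    d = depth[..., None]                          # broadcast over RGB
    direct = J * np.exp(-beta_att * d)            # attenuated direct signal
    backscatter = B_inf * (1.0 - np.exp(-beta_sc * d))
    return direct + backscatter

def restore(I, depth, beta_att, beta_sc, B_inf):
    """Invert the model to recover the water-free appearance."""
    d = depth[..., None]
    backscatter = B_inf * (1.0 - np.exp(-beta_sc * d))
    return (I - backscatter) / np.exp(-beta_att * d)

rng = np.random.default_rng(0)
J = rng.uniform(0.0, 1.0, size=(4, 4, 3))         # clear scene radiance
depth = rng.uniform(0.5, 5.0, size=(4, 4))        # metres of water
beta_att = np.array([0.60, 0.25, 0.10])           # red attenuates fastest
beta_sc = np.array([0.20, 0.30, 0.35])
B_inf = np.array([0.05, 0.20, 0.30])              # bluish veiling light

I = underwater_forward(J, depth, beta_att, beta_sc, B_inf)
print(np.allclose(restore(I, depth, beta_att, beta_sc, B_inf), J))  # True
```

Baking these per-channel terms directly into the Gaussian primitives, as the paper describes, is what lets the framework drop the auxiliary medium network that NeRF-based pipelines typically use to model the water column.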
Related papers
- Unifying Color and Lightness Correction with View-Adaptive Curve Adjustment for Robust 3D Novel View Synthesis [73.27997579020233]
We propose Luminance-GS++, a 3DGS-based framework for robust NVS under diverse illumination conditions. Our method combines a globally view-adaptive lightness adjustment with a local pixel-wise residual refinement for precise color correction.
arXiv Detail & Related papers (2026-02-20T16:20:50Z)
- Enhancing Underwater Light Field Images via Global Geometry-aware Diffusion Process [93.00033672476206]
GeoDiff-LF is a novel diffusion-based framework built upon SD-Turbo to enhance underwater 4-D LF imaging. By integrating diffusion priors and LF geometry, GeoDiff-LF effectively mitigates color distortion in underwater scenes.
arXiv Detail & Related papers (2026-01-29T02:27:22Z)
- From Restoration to Reconstruction: Rethinking 3D Gaussian Splatting for Underwater Scenes [13.730810237133822]
We propose R-Splatting, a unified framework that bridges underwater image restoration (UIR) with 3D Gaussian Splatting (3DGS). Our method integrates multiple enhanced views produced by diverse UIR models into a single reconstruction pipeline. Experiments on Seathru-NeRF and our new BlueCoral3D dataset demonstrate that R-Splatting outperforms strong baselines in both rendering quality and geometric accuracy.
arXiv Detail & Related papers (2025-09-22T13:50:20Z)
- UW-3DGS: Underwater 3D Reconstruction with Physics-Aware Gaussian Splatting [31.813166209083303]
We introduce UW-3DGS, a novel framework adapting 3D Gaussian Splatting (3DGS) for robust underwater reconstruction. Key innovations include (1) a plug-and-play learnable underwater image formation module using voxel-based regression for spatially varying attenuation and backscatter. Experiments on SeaThru-NeRF and UWBundle datasets show superior performance, achieving PSNR of 27.604, SSIM of 0.868, and LPIPS of 0.104 on SeaThru-NeRF, with a 65% reduction in floating artifacts.
arXiv Detail & Related papers (2025-08-08T09:36:32Z)
- 3D-UIR: 3D Gaussian for Underwater 3D Scene Reconstruction via Physics Based Appearance-Medium Decoupling [30.985414238960466]
3D Gaussian Splatting (3DGS) offers real-time rendering capabilities but struggles with inhomogeneous underwater environments. We propose a physics-based framework that disentangles object appearance from water medium effects. Our approach achieves both high-quality novel view synthesis and physically accurate scene restoration.
arXiv Detail & Related papers (2025-05-27T14:19:30Z)
- RUSplatting: Robust 3D Gaussian Splatting for Sparse-View Underwater Scene Reconstruction [9.070464075411472]
This paper presents an enhanced Gaussian Splatting-based framework that improves both the visual quality and accuracy of deep underwater rendering. We propose decoupled learning for RGB channels, guided by the physics of underwater attenuation, to enable more accurate colour restoration. We release a newly collected dataset, Submerged3D, captured specifically in deep-sea environments.
arXiv Detail & Related papers (2025-05-21T16:42:15Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper starts by analyzing and comparing two prominent shading techniques popularly used for inverse rendering, forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- GeoSplatting: Towards Geometry Guided Gaussian Splatting for Physically-based Inverse Rendering [69.67264955234494]
GeoSplatting is a novel approach that augments 3DGS with explicit geometry guidance for precise light transport modeling. By differentiably constructing a surface-grounded 3DGS from an optimizable mesh, our approach leverages well-defined mesh normals and the opaque mesh surface. This enhancement ensures precise material decomposition while preserving the efficiency and high-quality rendering capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-31T17:57:07Z)
- UW-GS: Distractor-Aware 3D Gaussian Splatting for Enhanced Underwater Scene Reconstruction [15.624536266709633]
3D Gaussian splatting (3DGS) offers the capability to achieve real-time, high-quality 3D scene rendering. However, 3DGS assumes that the scene is in a clear medium environment and struggles to generate satisfactory representations in underwater scenes. We introduce a novel Gaussian Splatting-based method, UW-GS, designed specifically for underwater applications.
arXiv Detail & Related papers (2024-10-02T13:08:56Z)
- WaterHE-NeRF: Water-ray Tracing Neural Radiance Fields for Underwater Scene Reconstruction [6.036702530679703]
We develop a new water-ray tracing field by Retinex theory that precisely encodes color, density, and illuminance attenuation in three-dimensional space.
WaterHE-NeRF, through its illuminance attenuation mechanism, generates both degraded and clear multi-view images.
arXiv Detail & Related papers (2023-12-12T02:55:14Z)
- StableDreamer: Taming Noisy Score Distillation Sampling for Text-to-3D [88.66678730537777]
We present StableDreamer, a methodology incorporating three advances.
First, we formalize the equivalence of the SDS generative prior and a simple supervised L2 reconstruction loss.
Second, our analysis shows that while image-space diffusion contributes to geometric precision, latent-space diffusion is crucial for vivid color rendition.
arXiv Detail & Related papers (2023-12-02T02:27:58Z)
- GS-IR: 3D Gaussian Splatting for Inverse Rendering [71.14234327414086]
We propose GS-IR, a novel inverse rendering approach based on 3D Gaussian Splatting (GS).
We extend GS, a top-performance representation for novel view synthesis, to estimate scene geometry, surface material, and environment illumination from multi-view images captured under unknown lighting conditions.
The flexible and expressive GS representation allows us to achieve fast and compact geometry reconstruction, photorealistic novel view synthesis, and effective physically-based rendering.
arXiv Detail & Related papers (2023-11-26T02:35:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.