OceanSplat: Object-aware Gaussian Splatting with Trinocular View Consistency for Underwater Scene Reconstruction
- URL: http://arxiv.org/abs/2601.04984v1
- Date: Thu, 08 Jan 2026 14:38:39 GMT
- Authors: Minseong Kweon, Jinsun Park
- Abstract summary: OceanSplat is a novel 3D Gaussian Splatting-based approach for representing 3D geometry in underwater scenes. We show that OceanSplat substantially outperforms existing methods for both scene reconstruction and restoration in scattering media.
- Score: 4.325717217536016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce OceanSplat, a novel 3D Gaussian Splatting-based approach for accurately representing 3D geometry in underwater scenes. To overcome multi-view inconsistencies caused by underwater optical degradation, our method enforces trinocular view consistency by rendering horizontally and vertically translated camera views relative to each input view and aligning them via inverse warping. Furthermore, these translated camera views are used to derive a synthetic epipolar depth prior through triangulation, which serves as a self-supervised depth regularizer. These geometric constraints facilitate the spatial optimization of 3D Gaussians and preserve scene structure in underwater environments. We also propose a depth-aware alpha adjustment that modulates the opacity of 3D Gaussians during early training based on their $z$-component and viewing direction, deterring the formation of medium-induced primitives. With our contributions, 3D Gaussians are disentangled from the scattering medium, enabling robust representation of object geometry and significantly reducing floating artifacts in reconstructed underwater scenes. Experiments on real-world underwater and simulated scenes demonstrate that OceanSplat substantially outperforms existing methods for both scene reconstruction and restoration in scattering media.
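The synthetic epipolar depth prior described in the abstract boils down to classic triangulation: once a view is re-rendered from a camera translated horizontally by a known baseline, per-pixel disparity against the input view yields metric depth via z = f·b/d. A minimal sketch under a pinhole-camera assumption (the function name and parameters are ours for illustration, not from the paper):

```python
import numpy as np

def triangulate_depth(disparity, focal_px, baseline_m, eps=1e-6):
    """Depth from horizontal disparity under a pinhole model: z = f * b / d.
    Pixels with non-positive disparity are marked invalid (depth 0)."""
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.zeros_like(disparity)
    valid = disparity > eps
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# With f = 500 px and a 0.1 m baseline, a 10-pixel disparity gives 5 m depth.
d = np.array([[10.0, 0.0],
              [20.0, 5.0]])
z = triangulate_depth(d, focal_px=500.0, baseline_m=0.1)
```

In the paper's setting the "disparity" would come from aligning the rendered translated views with the input view via inverse warping, and the resulting depth map acts as a self-supervised regularizer rather than ground truth.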
Related papers
- PFDepth: Heterogeneous Pinhole-Fisheye Joint Depth Estimation via Distortion-aware Gaussian-Splatted Volumetric Fusion [61.6340987158734]
We present the first pinhole-fisheye framework for heterogeneous multi-view depth estimation, PFDepth. PFDepth employs a unified architecture capable of processing arbitrary combinations of pinhole and fisheye cameras with varied intrinsics and extrinsics. We show that PFDepth sets a state-of-the-art performance on KITTI-360 and RealHet datasets over current mainstream depth networks.
arXiv Detail & Related papers (2025-09-30T09:38:59Z) - 3D Gaussian Flats: Hybrid 2D/3D Photometric Scene Reconstruction [62.84879632157956]
We propose a novel hybrid 2D/3D representation that jointly optimizes constrained planar (2D) Gaussians for modeling flat surfaces and freeform (3D) Gaussians for the rest of the scene. Our end-to-end approach dynamically detects and refines planar regions, improving both visual fidelity and geometric accuracy. It achieves state-of-the-art depth estimation on ScanNet++ and ScanNetv2, and excels at mesh extraction without overfitting to a specific camera model.
arXiv Detail & Related papers (2025-09-19T21:04:36Z) - Plenodium: UnderWater 3D Scene Reconstruction with Plenoptic Medium Representation [31.47797579690604]
We present Plenodium, a 3D representation framework capable of jointly modeling both objects and participating media. In contrast to existing medium representations that rely solely on view-dependent modeling, our novel plenoptic medium representation incorporates both directional and positional information. Experiments on real-world underwater datasets demonstrate that our method achieves significant improvements in 3D reconstruction.
arXiv Detail & Related papers (2025-05-27T14:37:58Z) - 3D-UIR: 3D Gaussian for Underwater 3D Scene Reconstruction via Physics Based Appearance-Medium Decoupling [30.985414238960466]
3D Gaussian Splatting (3DGS) offers real-time rendering capabilities, but struggles with underwater inhomogeneous environments. We propose a physics-based framework that disentangles object appearance from water medium effects. Our approach achieves both high-quality novel view synthesis and physically accurate scene restoration.
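Physics-based appearance-medium decoupling of this kind typically builds on the classic scattering-media image formation model, I = J·e^(−βz) + B·(1 − e^(−βz)), applied per colour channel. A minimal sketch (the coefficients below are illustrative assumptions, not values from any of these papers):

```python
import numpy as np

def underwater_formation(J, depth, beta, backscatter):
    """Scattering-media image model, per colour channel:
    I = J * exp(-beta * z) + B * (1 - exp(-beta * z)),
    where J is the clear-scene radiance (HxWx3), depth is HxW,
    and beta / backscatter broadcast over the channel axis."""
    J = np.asarray(J, dtype=np.float64)
    t = np.exp(-np.asarray(beta) * np.asarray(depth)[..., None])  # transmission
    return J * t + np.asarray(backscatter) * (1.0 - t)

# Red attenuates fastest underwater, so distant objects drift toward the
# bluish backscatter colour (illustrative coefficients).
J = np.ones((1, 1, 3))            # white object
beta = np.array([1.2, 0.4, 0.3])  # per-channel attenuation (R, G, B)
B = np.array([0.05, 0.25, 0.35])  # backscatter colour
near = underwater_formation(J, np.array([[0.5]]), beta, B)
far = underwater_formation(J, np.array([[5.0]]), beta, B)
```

Methods in this family fit the medium terms (β, B) separately from the object radiance J, so that J can be rendered on its own as the "restored" scene.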
arXiv Detail & Related papers (2025-05-27T14:19:30Z) - RUSplatting: Robust 3D Gaussian Splatting for Sparse-View Underwater Scene Reconstruction [9.070464075411472]
This paper presents an enhanced Gaussian Splatting-based framework that improves both the visual quality and accuracy of deep underwater rendering. We propose decoupled learning for RGB channels, guided by the physics of underwater attenuation, to enable more accurate colour restoration. We release a newly collected dataset, Submerged3D, captured specifically in deep-sea environments.
arXiv Detail & Related papers (2025-05-21T16:42:15Z) - AquaGS: Fast Underwater Scene Reconstruction with SfM-Free Gaussian Splatting [4.0317256978754505]
We introduce AquaGS, an SfM-free underwater scene reconstruction model based on the SeaThru algorithm. Our model can complete high-precision reconstruction in 30 seconds with only 3 image inputs.
arXiv Detail & Related papers (2025-05-03T12:05:57Z) - MonoGSDF: Exploring Monocular Geometric Cues for Gaussian Splatting-Guided Implicit Surface Reconstruction [86.87464903285208]
We introduce MonoGSDF, a novel method that couples primitives with a neural Signed Distance Field (SDF) for high-quality reconstruction. To handle arbitrary-scale scenes, we propose a scaling strategy for robust generalization. Experiments on real-world datasets show that MonoGSDF outperforms prior methods while maintaining efficiency.
arXiv Detail & Related papers (2024-11-25T20:07:07Z) - PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices. Our framework capitalizes on the fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z) - Mode-GS: Monocular Depth Guided Anchored 3D Gaussian Splatting for Robust Ground-View Scene Rendering [47.879695094904015]
We present a novel-view rendering algorithm, Mode-GS, for ground-robot trajectory datasets.
Our approach is based on using anchored Gaussian splats, which are designed to overcome the limitations of existing 3D Gaussian splatting algorithms.
Our method results in improved rendering performance, based on PSNR, SSIM, and LPIPS metrics, in ground scenes with free trajectory patterns.
arXiv Detail & Related papers (2024-10-06T23:01:57Z) - GEOcc: Geometrically Enhanced 3D Occupancy Network with Implicit-Explicit Depth Fusion and Contextual Self-Supervision [49.839374549646884]
This paper presents GEOcc, a Geometric-Enhanced Occupancy network tailored for vision-only surround-view perception. Our approach achieves state-of-the-art performance on the Occ3D-nuScenes dataset with the lowest image resolution needed and the lightest image backbone.
arXiv Detail & Related papers (2024-05-17T07:31:20Z) - Underwater 3D Reconstruction Using Light Fields [41.23269538226359]
We present an underwater 3D reconstruction solution using light field cameras.
We first develop a light field camera calibration algorithm that simultaneously estimates the camera parameters.
We then design a novel depth estimation algorithm for 3D reconstruction.
arXiv Detail & Related papers (2021-09-05T16:23:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.