Underwater 3D Reconstruction Using Light Fields
- URL: http://arxiv.org/abs/2109.02116v1
- Date: Sun, 5 Sep 2021 16:23:39 GMT
- Title: Underwater 3D Reconstruction Using Light Fields
- Authors: Yuqi Ding, Yu Ji, Jingyi Yu, Jinwei Ye
- Abstract summary: We present an underwater 3D reconstruction solution using light field cameras.
We first develop a light field camera calibration algorithm that simultaneously estimates the camera parameters and the geometry of the water-air interface.
We then design a novel depth estimation algorithm for 3D reconstruction.
- Score: 41.23269538226359
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Underwater 3D reconstruction is challenging due to the refraction of light at
the water-air interface (most electronic devices cannot be directly submerged
in water). In this paper, we present an underwater 3D reconstruction solution
using light field cameras. We first develop a light field camera calibration
algorithm that simultaneously estimates the camera parameters and the geometry
of the water-air interface. We then design a novel depth estimation algorithm
for 3D reconstruction. Specifically, we match correspondences on curved
epipolar lines caused by water refraction. We also observe that the
view-dependent specular reflection is very weak in the underwater environment,
so that the angularly sampled rays in the light field have uniform intensity. We
therefore propose an angular uniformity constraint for depth optimization. We
also develop a fast algorithm for locating the angular patches in the presence of
non-linear light paths. Extensive synthetic and real experiments demonstrate
that our method can perform underwater 3D reconstruction with high accuracy.
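The angular uniformity constraint described above can be sketched as a simple depth-sweep cost: for each candidate depth, gather the angular patch (the intensities of all angular samples of the same scene point) and prefer the depth at which the patch is most uniform. This is a minimal illustrative sketch, not the paper's actual implementation; the variance-based cost and all function names are assumptions.

```python
import numpy as np

def angular_uniformity_cost(angular_patch):
    # Intensity variance across angular samples of one scene point.
    # Under weak specular reflection, the correct depth yields a
    # near-uniform patch, i.e. near-zero variance.
    return float(np.var(angular_patch))

def estimate_depth(patches_by_depth, depths):
    # Depth sweep: pick the candidate depth whose angular patch
    # is most uniform (lowest variance).
    costs = [angular_uniformity_cost(p) for p in patches_by_depth]
    return depths[int(np.argmin(costs))]
```

In the actual method, locating these angular patches is nontrivial because refraction bends the light paths; this sketch assumes the patches have already been extracted.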
Related papers
- WaterClear-GS: Optical-Aware Gaussian Splatting for Underwater Reconstruction and Restoration [11.520966034974697]
We introduce WaterClear-GS, the first pure 3DGS-based framework that integrates underwater optical properties into Gaussian primitives. Our method employs a dual-branch optimization strategy to ensure underwater photometric consistency while naturally recovering water-free appearances. Experiments on standard benchmarks and our newly collected dataset demonstrate that WaterClear-GS achieves outstanding performance on both novel view synthesis (NVS) and underwater image restoration tasks.
arXiv Detail & Related papers (2026-01-27T16:14:34Z)
- OceanSplat: Object-aware Gaussian Splatting with Trinocular View Consistency for Underwater Scene Reconstruction [4.325717217536016]
OceanSplat is a novel 3D Gaussian Splatting-based approach for representing 3D geometry in underwater scenes. We show that OceanSplat substantially outperforms existing methods for both scene reconstruction and restoration in scattering media.
arXiv Detail & Related papers (2026-01-08T14:38:39Z)
- From Restoration to Reconstruction: Rethinking 3D Gaussian Splatting for Underwater Scenes [13.730810237133822]
We propose R-Splatting, a unified framework that bridges underwater image restoration (UIR) with 3D Gaussian Splatting (3DGS). Our method integrates multiple enhanced views produced by diverse UIR models into a single reconstruction pipeline. Experiments on Seathru-NeRF and our new BlueCoral3D dataset demonstrate that R-Splatting outperforms strong baselines in both rendering quality and geometric accuracy.
arXiv Detail & Related papers (2025-09-22T13:50:20Z)
- 3D-UIR: 3D Gaussian for Underwater 3D Scene Reconstruction via Physics Based Appearance-Medium Decoupling [30.985414238960466]
3D Gaussian Splatting (3DGS) offers real-time rendering capabilities, but struggles with underwater inhomogeneous environments. We propose a physics-based framework that disentangles object appearance from water medium effects. Our approach achieves both high-quality novel view synthesis and physically accurate scene restoration.
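The appearance-medium decoupling idea can be illustrated with the standard underwater image formation model, in which observed intensity is the clear scene radiance attenuated by range-dependent transmission plus backscattered veiling light. This is the well-known textbook model, not necessarily the exact formulation used by 3D-UIR; function and parameter names here are illustrative.

```python
import numpy as np

def underwater_image(J, depth, beta, B):
    # J:     clear (water-free) scene radiance, shape (H, W, 3)
    # depth: per-pixel camera-to-scene range, shape (H, W)
    # beta:  per-channel attenuation coefficient, shape (3,)
    # B:     backscatter (veiling) light color, shape (3,)
    t = np.exp(-beta * depth[..., None])   # per-channel transmission
    return J * t + B * (1.0 - t)           # attenuated signal + backscatter
```

At zero range the transmission is 1 and the observation equals the clear radiance; as range grows the image fades toward the backscatter color, which is why restoration methods try to invert this mixing.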
arXiv Detail & Related papers (2025-05-27T14:19:30Z)
- DoF-Gaussian: Controllable Depth-of-Field for 3D Gaussian Splatting [52.52398576505268]
We introduce DoF-Gaussian, a controllable depth-of-field method for 3D-GS.
We develop a lens-based imaging model based on geometric optics principles to control DoF effects.
Our framework is customizable and supports various interactive applications.
arXiv Detail & Related papers (2025-03-02T05:57:57Z)
- NeuroPump: Simultaneous Geometric and Color Rectification for Underwater Images [52.863935209616635]
Underwater image restoration aims to remove geometric and color distortions due to water refraction, absorption and scattering.
We propose NeuroPump, a self-supervised method to simultaneously optimize and rectify underwater geometry and color as if water were pumped out.
arXiv Detail & Related papers (2024-12-20T13:40:28Z)
- Gaussian Splashing: Direct Volumetric Rendering Underwater [6.2122699483618]
We present a new method that takes only a few minutes for reconstruction and renders novel underwater scenes at 140 FPS.
Named Gaussian Splashing, our method unifies the strengths and speed of 3DGS with an image formation model for capturing scattering.
It reveals distant scene details with far greater clarity than other methods, dramatically improving reconstructed and rendered images.
arXiv Detail & Related papers (2024-11-29T10:04:38Z)
- PGSR: Planar-based Gaussian Splatting for Efficient and High-Fidelity Surface Reconstruction [37.14913599050765]
We propose a fast planar-based Gaussian splatting reconstruction representation (PGSR) to achieve high-fidelity surface reconstruction.
We then introduce single-view geometric, multi-view photometric, and geometric regularization to preserve global geometric accuracy.
Our method achieves fast training and rendering while maintaining high-fidelity rendering and geometric reconstruction, outperforming 3DGS-based and NeRF-based methods.
arXiv Detail & Related papers (2024-06-10T17:59:01Z)
- Phase Guided Light Field for Spatial-Depth High Resolution 3D Imaging [36.208109063579066]
For 3D imaging, light field cameras are typically single-shot, and they suffer heavily from low spatial resolution and depth accuracy.
We propose a phase guided light field algorithm to significantly improve both the spatial and depth resolutions for off-the-shelf light field cameras.
arXiv Detail & Related papers (2023-11-17T15:08:15Z)
- Spatiotemporally Consistent HDR Indoor Lighting Estimation [66.26786775252592]
We propose a physically-motivated deep learning framework to solve the indoor lighting estimation problem.
Given a single LDR image with a depth map, our method predicts spatially consistent lighting at any given image position.
Our framework achieves photorealistic lighting prediction with higher quality compared to state-of-the-art single-image or video-based methods.
arXiv Detail & Related papers (2023-05-07T20:36:29Z)
- $PC^2$: Projection-Conditioned Point Cloud Diffusion for Single-Image 3D Reconstruction [97.06927852165464]
Reconstructing the 3D shape of an object from a single RGB image is a long-standing and highly challenging problem in computer vision.
We propose a novel method for single-image 3D reconstruction which generates a sparse point cloud via a conditional denoising diffusion process.
arXiv Detail & Related papers (2023-02-21T13:37:07Z)
- SUCRe: Leveraging Scene Structure for Underwater Color Restoration [1.9490160607392462]
We introduce SUCRe, a novel method that exploits the scene's 3D structure for underwater color restoration.
We conduct extensive quantitative and qualitative analyses of our approach in a variety of scenarios ranging from natural light to deep-sea environments.
arXiv Detail & Related papers (2022-12-18T16:53:13Z)
- Self-calibrating Photometric Stereo by Neural Inverse Rendering [88.67603644930466]
This paper tackles the task of uncalibrated photometric stereo for 3D object reconstruction.
We propose a new method that jointly optimizes object shape, light directions, and light intensities.
Our method demonstrates state-of-the-art accuracy in light estimation and shape recovery on real-world datasets.
arXiv Detail & Related papers (2022-07-16T02:46:15Z)
- Hyperspectral 3D Mapping of Underwater Environments [0.7087237546722617]
We present an initial method for creating hyperspectral 3D reconstructions of underwater environments.
By fusing the data gathered by a classical RGB camera, an inertial navigation system and a hyperspectral push-broom camera, we show that the proposed method creates highly accurate 3D reconstructions with hyperspectral textures.
arXiv Detail & Related papers (2021-10-13T08:37:22Z)
- Learning Indoor Inverse Rendering with 3D Spatially-Varying Lighting [149.1673041605155]
We address the problem of jointly estimating albedo, normals, depth and 3D spatially-varying lighting from a single image.
Most existing methods formulate the task as image-to-image translation, ignoring the 3D properties of the scene.
We propose a unified, learning-based inverse framework that formulates 3D spatially-varying lighting.
arXiv Detail & Related papers (2021-09-13T15:29:03Z)
- Refractive Geometry for Underwater Domes [3.24029503704305]
We show how to compute the center of refraction without knowledge of exact air, glass or water properties.
We propose a pure underwater calibration procedure to estimate the decentering from multiple images.
This estimate can either be used during adjustment to guide the mechanical position of the lens, or can be considered in photogrammetric underwater applications.
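The refraction at an air-water (or glass) boundary that underlies this kind of calibration follows Snell's law; in vector form, a ray direction can be bent at an interface as sketched below. This is the standard vector formulation, given here only as background for the refractive-geometry discussion, not this paper's dome-specific derivation.

```python
import numpy as np

def refract(d, n, n1, n2):
    # d:  unit incident ray direction
    # n:  unit surface normal, pointing toward the incident medium
    # n1, n2: refractive indices (e.g. air 1.0 -> water 1.33)
    r = n1 / n2
    cos_i = -np.dot(d, n)
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection, no transmitted ray
    cos_t = np.sqrt(1.0 - sin2_t)
    # Vector form of Snell's law: transmitted direction
    return r * d + (r * cos_i - cos_t) * n
```

A ray at normal incidence passes straight through, while oblique rays entering a denser medium bend toward the normal; it is this bending that makes underwater epipolar geometry curved rather than linear.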
arXiv Detail & Related papers (2021-08-14T16:19:11Z)
- Deep 3D Capture: Geometry and Reflectance from Sparse Multi-View Images [59.906948203578544]
We introduce a novel learning-based method to reconstruct the high-quality geometry and complex, spatially-varying BRDF of an arbitrary object.
We first estimate per-view depth maps using a deep multi-view stereo network.
These depth maps are used to coarsely align the different views.
We propose a novel multi-view reflectance estimation network architecture.
arXiv Detail & Related papers (2020-03-27T21:28:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.