MuS-Polar3D: A Benchmark Dataset for Computational Polarimetric 3D Imaging under Multi-Scattering Conditions
- URL: http://arxiv.org/abs/2512.21513v1
- Date: Thu, 25 Dec 2025 05:32:39 GMT
- Title: MuS-Polar3D: A Benchmark Dataset for Computational Polarimetric 3D Imaging under Multi-Scattering Conditions
- Authors: Puyun Wang, Kaimin Yu, Huayang He, Xianyu Wu
- Abstract summary: Polarization-based underwater 3D imaging exploits polarization cues to suppress background scattering, exhibiting distinct advantages in turbid water. MuS-Polar3D is the first publicly available benchmark dataset for quantitative turbidity underwater polarization-based 3D imaging.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Polarization-based underwater 3D imaging exploits polarization cues to suppress background scattering, exhibiting distinct advantages in turbid water. Although data-driven polarization-based underwater 3D reconstruction methods show great potential, existing public datasets lack sufficient diversity in scattering and observation conditions, hindering fair comparisons among different approaches, including single-view and multi-view polarization imaging methods. To address this limitation, we construct MuS-Polar3D, a benchmark dataset comprising polarization images of 42 objects captured under seven quantitatively controlled scattering conditions and five viewpoints, together with high-precision 3D models (+/- 0.05 mm accuracy), normal maps, and foreground masks. The dataset supports multiple vision tasks, including normal estimation, object segmentation, descattering, and 3D reconstruction. Inspired by computational imaging, we further decouple underwater 3D reconstruction under scattering into a two-stage pipeline, namely descattering followed by 3D reconstruction, from an imaging-chain perspective. Extensive evaluations using multiple baseline methods under complex scattering conditions demonstrate the effectiveness of the proposed benchmark, achieving a best mean angular error of 15.49 degrees. To the best of our knowledge, MuS-Polar3D is the first publicly available benchmark dataset for quantitative turbidity underwater polarization-based 3D imaging, enabling accurate reconstruction and fair algorithm evaluation under controllable scattering conditions. The dataset and code are publicly available at https://github.com/WangPuyun/MuS-Polar3D.
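The abstract reports a best mean angular error (MAE) of 15.49 degrees for normal estimation. The exact evaluation protocol is defined in the paper's repository; as a rough illustration, the standard formulation of this metric is the per-pixel angle between predicted and ground-truth unit normals, averaged over foreground pixels. A minimal sketch (function name, array shapes, and masking convention are assumptions, not taken from the paper):

```python
import numpy as np

def mean_angular_error(pred, gt, mask=None):
    """Mean angular error in degrees between two normal maps of shape
    (H, W, 3). A common metric for surface-normal estimation; the exact
    protocol used by MuS-Polar3D may differ (e.g. masking, normalization)."""
    # Normalize both maps to unit length so the dot product gives cos(angle).
    pred = pred / np.linalg.norm(pred, axis=-1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=-1, keepdims=True)
    # Per-pixel angle via the dot product, clipped for numerical safety.
    cos = np.clip(np.sum(pred * gt, axis=-1), -1.0, 1.0)
    ang = np.degrees(np.arccos(cos))
    if mask is not None:
        ang = ang[mask]  # evaluate foreground pixels only
    return float(ang.mean())

# Identical maps give 0 degrees; opposite normals give 180 degrees.
n = np.zeros((4, 4, 3))
n[..., 2] = 1.0
print(mean_angular_error(n, n))    # → 0.0
print(mean_angular_error(n, -n))   # → 180.0
```

The foreground masks shipped with the dataset would plug directly into the `mask` argument so background pixels do not dilute the score.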
Related papers
- UD-SfPNet: An Underwater Descattering Shape-from-Polarization Network for 3D Normal Reconstruction [3.2610672252390724]
Polarization imaging offers the unique dual advantages of descattering and shape-from-polarization (SfP) 3D reconstruction. This paper proposes UD-SfPNet, an underwater descattering shape-from-polarization network that leverages polarization cues for improved 3D surface normal prediction.
arXiv Detail & Related papers (2026-03-01T04:10:36Z) - PolarAnything: Diffusion-based Polarimetric Image Synthesis [59.14294818211059]
We propose PolarAnything, capable of synthesizing polarization images from a single RGB input with both photorealism and physical accuracy. Experiments show that our model generates high-quality polarization images and supports downstream tasks like shape from polarization.
arXiv Detail & Related papers (2025-07-23T07:09:10Z) - PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices. Our framework capitalizes on the fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z) - Object Modeling from Underwater Forward-Scan Sonar Imagery with Sea-Surface Multipath [16.057203527513632]
A key contribution, for objects imaged in the proximity of the sea surface, is to resolve the multipath artifacts due to the air-water interface.
Here, the object image formed by the direct target backscatter is almost always corrupted by the ghost and sometimes by the mirror components.
We model, localize, and discard the corrupted object region within each view, thus avoiding the distortion of recovered 3-D shape.
arXiv Detail & Related papers (2024-09-10T18:46:25Z) - GEOcc: Geometrically Enhanced 3D Occupancy Network with Implicit-Explicit Depth Fusion and Contextual Self-Supervision [49.839374549646884]
This paper presents GEOcc, a Geometric-Enhanced Occupancy network tailored for vision-only surround-view perception. Our approach achieves state-of-the-art performance on the Occ3D-nuScenes dataset with the lowest image resolution needed and the lightest image backbone.
arXiv Detail & Related papers (2024-05-17T07:31:20Z) - VFMM3D: Releasing the Potential of Image by Vision Foundation Model for Monocular 3D Object Detection [80.62052650370416]
Monocular 3D object detection holds significant importance across various applications, including autonomous driving and robotics.
In this paper, we present VFMM3D, an innovative framework that leverages the capabilities of Vision Foundation Models (VFMs) to accurately transform single-view images into LiDAR point cloud representations.
arXiv Detail & Related papers (2024-04-15T03:12:12Z) - Polarimetric Multi-View Inverse Rendering [13.391866136230165]
A polarization camera has great potential for 3D reconstruction since the angle of polarization (AoP) and the degree of polarization (DoP) of reflected light are related to an object's surface normal.
We propose a novel 3D reconstruction method called Polarimetric Multi-View Inverse Rendering (Polarimetric MVIR) that effectively exploits geometric, photometric, and polarimetric cues extracted from input multi-view color-polarization images.
arXiv Detail & Related papers (2022-12-24T12:12:12Z) - OPA-3D: Occlusion-Aware Pixel-Wise Aggregation for Monocular 3D Object
Detection [51.153003057515754]
OPA-3D is a single-stage, end-to-end, Occlusion-Aware Pixel-Wise Aggregation network.
It jointly estimates dense scene depth with depth-bounding box residuals and object bounding boxes.
It outperforms state-of-the-art methods on the main Car category.
arXiv Detail & Related papers (2022-11-02T14:19:13Z) - PolarFormer: Multi-camera 3D Object Detection with Polar Transformers [93.49713023975727]
3D object detection in autonomous driving aims to reason about "what" and "where" the objects of interest are in a 3D world.
Existing methods often adopt the canonical Cartesian coordinate system with perpendicular axes.
We propose a new Polar Transformer (PolarFormer) for more accurate 3D object detection in the bird's-eye-view (BEV) taking as input only multi-camera 2D images.
arXiv Detail & Related papers (2022-06-30T16:32:48Z) - Investigating Spherical Epipolar Rectification for Multi-View Stereo 3D Reconstruction [1.0152838128195467]
We propose a spherical model for epipolar rectification to minimize distortions caused by differences in principal rays.
We show through qualitative and quantitative evaluation that the proposed approach performs better than frame-based epipolar correction.
arXiv Detail & Related papers (2022-04-08T15:50:20Z) - Deep Polarization Imaging for 3D shape and SVBRDF Acquisition [7.86578678811226]
We present a novel method for efficient acquisition of shape and spatially varying reflectance of 3D objects using polarization cues.
Unlike previous works that have exploited polarization to estimate material or object appearance under certain constraints, we lift such restrictions by coupling polarization imaging with deep learning.
We demonstrate our approach to achieve superior results compared to recent works employing deep learning in conjunction with flash illumination.
arXiv Detail & Related papers (2021-05-06T17:58:43Z) - Exploration of Whether Skylight Polarization Patterns Contain Three-dimensional Attitude Information [2.6641834518599308]
A social spider optimization (SSO) method is proposed to estimate the three attitude angles.
Simulation results show that the algorithm can estimate 3D attitude and that the established sky model contains 3D attitude information.
arXiv Detail & Related papers (2020-11-30T12:10:29Z)