Shape from Polarization of Thermal Emission and Reflection
- URL: http://arxiv.org/abs/2506.18217v1
- Date: Mon, 23 Jun 2025 00:33:17 GMT
- Title: Shape from Polarization of Thermal Emission and Reflection
- Authors: Kazuma Kitazawa, Tsuyoshi Takatani
- Abstract summary: We leverage the Shape from Polarization (SfP) technique in the Long-Wave Infrared (LWIR) spectrum, where most materials are opaque and emissive. We formulated a polarization model that explicitly accounts for the combined effects of emission and reflection. We implemented a prototype system and created ThermoPol, the first real-world benchmark dataset for LWIR SfP.
- Score: 2.7317088388886384
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Shape estimation for transparent objects is challenging due to their complex light transport. To circumvent these difficulties, we leverage the Shape from Polarization (SfP) technique in the Long-Wave Infrared (LWIR) spectrum, where most materials are opaque and emissive. While a few prior studies have explored LWIR SfP, these attempts suffered from significant errors due to inadequate polarimetric modeling, particularly the neglect of reflection. Addressing this gap, we formulated a polarization model that explicitly accounts for the combined effects of emission and reflection. Based on this model, we estimated surface normals using not only a direct model-based method but also a learning-based approach employing a neural network trained on a physically-grounded synthetic dataset. Furthermore, we modeled the LWIR polarimetric imaging process, accounting for inherent systematic errors to ensure accurate polarimetry. We implemented a prototype system and created ThermoPol, the first real-world benchmark dataset for LWIR SfP. Through comprehensive experiments, we demonstrated the high accuracy and broad applicability of our method across various materials, including those transparent in the visible spectrum.
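The abstract describes estimating surface normals from LWIR polarization measurements under a model that mixes emission and reflection. The following is a minimal sketch of the generic Stokes-parameter analysis that underlies SfP pipelines, not the paper's calibrated LWIR imaging model; the four-angle measurement setup, function names, and the simple two-hypothesis azimuth step are illustrative assumptions.

```python
import numpy as np

def stokes_from_polarizer_images(i0, i45, i90, i135):
    """Linear Stokes parameters from images taken at four polarizer angles.

    Assumes the ideal relation I(phi) = 0.5 * (S0 + S1*cos(2*phi) + S2*sin(2*phi));
    the paper additionally models systematic errors of LWIR polarimetric imaging,
    which is not reproduced here.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def dolp_aolp(s0, s1, s2, eps=1e-8):
    """Per-pixel degree and angle of linear polarization."""
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, eps)
    aolp = 0.5 * np.arctan2(s2, s1)  # in (-pi/2, pi/2]
    return dolp, aolp

if __name__ == "__main__":
    # Random data standing in for four LWIR polarizer-angle frames (hypothetical).
    rng = np.random.default_rng(0)
    frames = rng.uniform(0.5, 1.0, size=(4, 4, 4))
    s0, s1, s2 = stokes_from_polarizer_images(*frames)
    dolp, aolp = dolp_aolp(s0, s1, s2)
    # Emission and reflection polarize roughly orthogonally, so the AoLP yields two
    # azimuth hypotheses 90 degrees apart; the paper's combined emission-reflection
    # model is what resolves this ambiguity, and is not reproduced here.
    azimuth_if_emission_dominant = aolp
    azimuth_if_reflection_dominant = np.mod(aolp + np.pi / 2, np.pi) - np.pi / 2
    print(dolp.shape, azimuth_if_emission_dominant.shape, azimuth_if_reflection_dominant.shape)
```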
Related papers
- Glossy Object Reconstruction with Cost-effective Polarized Acquisition [41.96986483856648]
This work introduces a scalable polarization-aided approach that employs cost-effective acquisition tools. The proposed approach represents polarimetric BRDF, Stokes vectors, and polarization states of object surfaces as neural implicit fields. By leveraging fundamental physical principles for the implicit representation of polarization rendering, our method demonstrates superiority over existing techniques.
arXiv Detail & Related papers (2025-04-09T16:38:51Z)
- Generalizable Non-Line-of-Sight Imaging with Learnable Physical Priors [52.195637608631955]
Non-line-of-sight (NLOS) imaging has attracted increasing attention due to its potential applications.
Existing NLOS reconstruction approaches are constrained by their reliance on empirical physical priors.
We introduce a novel learning-based solution comprising two key designs: Learnable Path Compensation (LPC) and Adaptive Phasor Field (APF).
arXiv Detail & Related papers (2024-09-21T04:39:45Z)
- SS-SfP: Neural Inverse Rendering for Self Supervised Shape from (Mixed) Polarization [21.377923666134116]
Estimating shape from polarization cues is the problem popularly known as Shape from Polarization (SfP).
We present a novel inverse rendering-based framework to estimate the 3D shape (per-pixel surface normals and depth) of objects and scenes from single-view polarization images.
arXiv Detail & Related papers (2024-07-12T14:29:00Z)
- Deep Polarization Cues for Single-shot Shape and Subsurface Scattering Estimation [13.561603248769302]
We propose a novel learning-based method to jointly estimate the shape and subsurface scattering (SSS) parameters of translucent objects.
Our observations indicate that SSS affects not only the light intensity but also the polarization signal.
We introduce the first large-scale synthetic dataset of polarized translucent objects for training our model.
arXiv Detail & Related papers (2024-07-11T03:00:24Z)
- Robust Depth Enhancement via Polarization Prompt Fusion Tuning [112.88371907047396]
We present a framework that leverages polarization imaging to improve inaccurate depth measurements from various depth sensors.
Our method first adopts a learning-based strategy in which a neural network is trained to estimate a dense and complete depth map from polarization data and a sensor depth map obtained from different sensors.
To further improve the performance, we propose a Polarization Prompt Fusion Tuning (PPFT) strategy to effectively utilize RGB-based models pre-trained on large-scale datasets.
arXiv Detail & Related papers (2024-04-05T17:55:33Z)
- NeISF: Neural Incident Stokes Field for Geometry and Material Estimation [50.588983686271284]
Multi-view inverse rendering is the problem of estimating the scene parameters such as shapes, materials, or illuminations from a sequence of images captured under different viewpoints.
We propose Neural Incident Stokes Fields (NeISF), a multi-view inverse framework that reduces ambiguities using polarization cues.
arXiv Detail & Related papers (2023-11-22T06:28:30Z)
- Polarimetric Information for Multi-Modal 6D Pose Estimation of Photometrically Challenging Objects with Limited Data [51.95347650131366]
6D pose estimation pipelines that rely on RGB-only or RGB-D data show limitations for photometrically challenging objects.
A supervised learning-based method utilising complementary polarisation information is proposed to overcome such limitations.
arXiv Detail & Related papers (2023-08-21T10:56:00Z)
- NeFII: Inverse Rendering for Reflectance Decomposition with Near-Field Indirect Illumination [48.42173911185454]
Inverse rendering methods aim to estimate geometry, materials, and illumination from multi-view RGB images.
We propose an end-to-end inverse rendering pipeline that decomposes materials and illumination from multi-view images.
arXiv Detail & Related papers (2023-03-29T12:05:19Z)
- Transparent Shape from a Single View Polarization Image [6.18278691318801]
This paper presents a learning-based method for transparent surface estimation from a single view polarization image.
Existing Shape from Polarization (SfP) methods have difficulty estimating transparent shapes since the inherent transmission interference heavily reduces the reliability of physics-based priors.
arXiv Detail & Related papers (2022-04-13T12:24:32Z)
- Uncalibrated Neural Inverse Rendering for Photometric Stereo of General Surfaces [103.08512487830669]
This paper presents an uncalibrated deep neural network framework for the photometric stereo problem.
Existing neural network-based methods either require exact light directions or ground-truth surface normals of the object or both.
We propose an uncalibrated neural inverse rendering approach to this problem.
arXiv Detail & Related papers (2020-12-12T10:33:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.