ThermoNeRF: Multimodal Neural Radiance Fields for Thermal Novel View Synthesis
- URL: http://arxiv.org/abs/2403.12154v1
- Date: Mon, 18 Mar 2024 18:10:34 GMT
- Title: ThermoNeRF: Multimodal Neural Radiance Fields for Thermal Novel View Synthesis
- Authors: Mariam Hassan, Florent Forest, Olga Fink, Malcolm Mielle
- Abstract summary: We propose ThermoNeRF, a novel approach to rendering new RGB and thermal views of a scene jointly.
To overcome the lack of texture in thermal images, we use paired RGB and thermal images to learn scene density.
We also introduce ThermoScenes, a new dataset to palliate the lack of available RGB+thermal datasets for scene reconstruction.
- Score: 5.66229031510643
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Thermal scene reconstruction exhibits great potential for applications across a broad spectrum of fields, including building energy consumption analysis and non-destructive testing. However, existing methods typically require dense scene measurements and often rely on RGB images for 3D geometry reconstruction, with thermal information being projected post-reconstruction. This two-step strategy, adopted due to the lack of texture in thermal images, can lead to disparities between the geometry and temperatures of the reconstructed objects and those of the actual scene. To address this challenge, we propose ThermoNeRF, a novel multimodal approach based on Neural Radiance Fields, capable of rendering new RGB and thermal views of a scene jointly. To overcome the lack of texture in thermal images, we use paired RGB and thermal images to learn scene density, while distinct networks estimate color and temperature information. Furthermore, we introduce ThermoScenes, a new dataset to palliate the lack of available RGB+thermal datasets for scene reconstruction. Experimental results validate that ThermoNeRF achieves accurate thermal image synthesis, with an average mean absolute error of 1.5°C, an improvement of over 50% compared to using concatenated RGB+thermal data with Nerfacto, a state-of-the-art NeRF method.
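The abstract describes a shared scene-density backbone supervised by paired RGB and thermal images, with separate networks predicting color and temperature. The following is a minimal PyTorch sketch of that idea; the module names, layer sizes, encodings, and loss weighting are illustrative assumptions, not the authors' ThermoNeRF implementation.

```python
# Illustrative sketch of a joint RGB+thermal radiance field.
# Layer widths, positional-encoding dimensions, and the loss weighting
# are placeholder assumptions, not the reference ThermoNeRF code.
import torch
import torch.nn as nn

class ThermoField(nn.Module):
    def __init__(self, pos_dim=63, dir_dim=27, hidden=256):
        super().__init__()
        # Shared trunk: a single density field learned from both modalities.
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.density = nn.Linear(hidden, 1)  # sigma(x)
        # Distinct heads: view-dependent color vs. temperature.
        self.color_head = nn.Sequential(
            nn.Linear(hidden + dir_dim, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),  # RGB in [0, 1]
        )
        self.thermal_head = nn.Sequential(
            nn.Linear(hidden, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 1),  # temperature (normalized units)
        )

    def forward(self, x_enc, d_enc):
        h = self.trunk(x_enc)
        sigma = torch.relu(self.density(h))
        rgb = self.color_head(torch.cat([h, d_enc], dim=-1))
        temp = self.thermal_head(h)
        return sigma, rgb, temp

# Training would volume-render both outputs along each ray and combine
# per-modality losses on paired images, e.g.:
# loss = mse(rgb_render, rgb_gt) + lambda_t * mse(temp_render, temp_gt)
```

Because both modalities supervise the same density field, the geometry is constrained by the textured RGB views while the thermal head only has to explain per-point temperature, which is consistent with the two-head design sketched above.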
Related papers
- Thermal3D-GS: Physics-induced 3D Gaussians for Thermal Infrared Novel-view Synthesis [11.793425521298488]
This paper introduces a physics-induced 3D Gaussian splatting method named Thermal3D-GS.
The first large-scale benchmark dataset for this field named Thermal Infrared Novel-view Synthesis dataset (TI-NSD) is created.
The results indicate that our method outperforms the baseline method with a 3.03 dB improvement in PSNR.
arXiv Detail & Related papers (2024-09-12T13:46:53Z) - ThermalGaussian: Thermal 3D Gaussian Splatting [25.536611434289647]
We propose ThermalGaussian, the first thermal 3DGS approach capable of rendering high-quality images in RGB and thermal modalities.
We conduct comprehensive experiments to show that ThermalGaussian achieves photorealistic rendering of thermal images and improves the rendering quality of RGB images.
arXiv Detail & Related papers (2024-09-11T11:45:57Z) - ThermalNeRF: Thermal Radiance Fields [32.881758519242155]
We propose a unified framework for scene reconstruction from a set of LWIR and RGB images.
We calibrate the RGB and infrared cameras with respect to each other, as a preprocessing step.
We show that our method is capable of thermal super-resolution, as well as visually removing obstacles to reveal objects occluded in either the RGB or thermal channels.
arXiv Detail & Related papers (2024-07-22T02:51:29Z) - Taming Latent Diffusion Model for Neural Radiance Field Inpainting [63.297262813285265]
Neural Radiance Field (NeRF) is a representation for 3D reconstruction from multi-view images.
We propose tempering the diffusion model's stochasticity with per-scene customization and mitigating the textural shift with masked training.
Our framework yields state-of-the-art NeRF inpainting results on various real-world scenes.
arXiv Detail & Related papers (2024-04-15T17:59:57Z) - Leveraging Thermal Modality to Enhance Reconstruction in Low-Light Conditions [25.14690752484963]
Neural Radiance Fields (NeRF) accomplishes photo-realistic novel view synthesis by learning the implicit representation of a scene from multi-view images.
Existing approaches reconstruct low-light scenes from raw images but struggle to recover texture and boundary details in dark regions.
We present Thermal-NeRF, which takes thermal and visible raw images as inputs, to accomplish visible and thermal view synthesis simultaneously.
arXiv Detail & Related papers (2024-03-21T00:35:31Z) - Thermal-NeRF: Neural Radiance Fields from an Infrared Camera [29.58060552299745]
We introduce Thermal-NeRF, the first method that estimates a volumetric scene representation in the form of a NeRF solely from IR imaging.
We conduct extensive experiments to demonstrate that Thermal-NeRF can achieve superior quality compared to existing methods.
arXiv Detail & Related papers (2024-03-15T14:27:15Z) - Pano-NeRF: Synthesizing High Dynamic Range Novel Views with Geometry from Sparse Low Dynamic Range Panoramic Images [82.1477261107279]
We propose the irradiance fields from sparse LDR panoramic images to increase the observation counts for faithful geometry recovery.
Experiments demonstrate that the irradiance fields outperform state-of-the-art methods on both geometry recovery and HDR reconstruction.
arXiv Detail & Related papers (2023-12-26T08:10:22Z) - Does Thermal Really Always Matter for RGB-T Salient Object Detection? [153.17156598262656]
This paper proposes a network named TNet to solve the RGB-T salient object detection (SOD) task.
In this paper, we introduce a global illumination estimation module to predict the global illuminance score of the image.
On the other hand, we introduce a two-stage localization and complementation module in the decoding phase to transfer object localization cue and internal integrity cue in thermal features to the RGB modality.
arXiv Detail & Related papers (2022-10-09T13:50:12Z) - Enhancement of Novel View Synthesis Using Omnidirectional Image Completion [61.78187618370681]
We present a method for synthesizing novel views from a single 360-degree RGB-D image based on the neural radiance field (NeRF).
Experiments demonstrated that the proposed method can synthesize plausible novel views while preserving the features of the scene for both artificial and real-world data.
arXiv Detail & Related papers (2022-03-18T13:49:25Z) - A Large-Scale, Time-Synchronized Visible and Thermal Face Dataset [62.193924313292875]
We present the DEVCOM Army Research Laboratory Visible-Thermal Face dataset (ARL-VTF)
With over 500,000 images from 395 subjects, the ARL-VTF dataset represents, to the best of our knowledge, the largest collection of paired visible and thermal face images to date.
This paper presents benchmark results and analysis on thermal face landmark detection and thermal-to-visible face verification by evaluating state-of-the-art models on the ARL-VTF dataset.
arXiv Detail & Related papers (2021-01-07T17:17:12Z) - iNeRF: Inverting Neural Radiance Fields for Pose Estimation [68.91325516370013]
We present iNeRF, a framework that performs mesh-free pose estimation by "inverting" a Neural Radiance Field (NeRF).
NeRFs have been shown to be remarkably effective for the task of view synthesis.
arXiv Detail & Related papers (2020-12-10T18:36:40Z)