TeX-NeRF: Neural Radiance Fields from Pseudo-TeX Vision
- URL: http://arxiv.org/abs/2410.04873v1
- Date: Mon, 7 Oct 2024 09:43:28 GMT
- Title: TeX-NeRF: Neural Radiance Fields from Pseudo-TeX Vision
- Authors: Chonghao Zhong, Chao Xu
- Abstract summary: We propose TeX-NeRF, a 3D reconstruction method using only infrared images.
We map the temperatures (T), emissivities (e), and textures (X) of the scene into the saturation (S), hue (H), and value (V) channels of the HSV color space.
Novel view synthesis using the processed images yields excellent results.
- Score: 5.77388464529179
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural radiance fields (NeRF) have gained significant attention for their exceptional visual effects. However, most existing NeRF methods reconstruct 3D scenes from RGB images captured by visible light cameras. In practical scenarios like darkness, low light, or bad weather, visible light cameras become ineffective. Therefore, we propose TeX-NeRF, a 3D reconstruction method using only infrared images, which introduces the object material emissivity as a priori, preprocesses the infrared images using Pseudo-TeX vision, and maps the temperatures (T), emissivities (e), and textures (X) of the scene into the saturation (S), hue (H), and value (V) channels of the HSV color space, respectively. Novel view synthesis using the processed images has yielded excellent results. Additionally, we introduce 3D-TeX Datasets, the first dataset comprising infrared images and their corresponding Pseudo-TeX vision images. Experiments demonstrate that our method not only matches the quality of scene reconstruction achieved with high-quality RGB images but also provides accurate temperature estimations for objects in the scene.
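The abstract's channel assignment (temperature to saturation, emissivity to hue, texture to value) can be sketched as a simple per-pixel HSV-to-RGB conversion. This is a hypothetical illustration only: the function name, the assumption that each map is already normalized to [0, 1], and the use of a plain HSV conversion are all assumptions, not the paper's actual preprocessing pipeline.

```python
# Hedged sketch of the Pseudo-TeX mapping described in the abstract:
# emissivity (e) -> hue (H), temperature (T) -> saturation (S),
# texture (X) -> value (V), then convert HSV to an RGB image.
import colorsys
import numpy as np

def pseudo_tex(T, e, X):
    """Map per-pixel maps (each HxW, values in [0, 1]) to an HxWx3 RGB image."""
    hue, sat, val = e, T, X  # channel assignment from the abstract
    rgb = np.empty(T.shape + (3,))
    for idx in np.ndindex(T.shape):
        rgb[idx] = colorsys.hsv_to_rgb(hue[idx], sat[idx], val[idx])
    return rgb

# Toy 2x2 "scene" with made-up normalized values
T = np.array([[0.2, 0.8], [0.5, 0.5]])  # temperature
e = np.array([[0.0, 0.3], [0.6, 0.9]])  # emissivity
X = np.array([[1.0, 0.7], [0.4, 0.2]])  # texture
img = pseudo_tex(T, e, X)
```

The resulting pseudo-color images can then be fed to a standard NeRF pipeline in place of RGB captures.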
Related papers
- Thermal3D-GS: Physics-induced 3D Gaussians for Thermal Infrared Novel-view Synthesis [11.793425521298488]
This paper introduces a physics-induced 3D Gaussian splatting method named Thermal3D-GS.
The first large-scale benchmark dataset for this field named Thermal Infrared Novel-view Synthesis dataset (TI-NSD) is created.
The results indicate that our method outperforms the baseline method with a 3.03 dB improvement in PSNR.
arXiv Detail & Related papers (2024-09-12T13:46:53Z) - ThermalNeRF: Thermal Radiance Fields [32.881758519242155]
We propose a unified framework for scene reconstruction from a set of LWIR and RGB images.
We calibrate the RGB and infrared cameras with respect to each other, as a preprocessing step.
We show that our method is capable of thermal super-resolution, as well as visually removing obstacles to reveal objects occluded in either the RGB or thermal channels.
arXiv Detail & Related papers (2024-07-22T02:51:29Z) - Mesh2NeRF: Direct Mesh Supervision for Neural Radiance Field Representation and Generation [51.346733271166926]
Mesh2NeRF is an approach to derive ground-truth radiance fields from textured meshes for 3D generation tasks.
We validate the effectiveness of Mesh2NeRF across various tasks.
arXiv Detail & Related papers (2024-03-28T11:22:53Z) - Leveraging Thermal Modality to Enhance Reconstruction in Low-Light Conditions [25.14690752484963]
Neural Radiance Fields (NeRF) accomplishes photo-realistic novel view synthesis by learning the implicit representation of a scene from multi-view images.
Existing approaches reconstruct low-light scenes from raw images but struggle to recover texture and boundary details in dark regions.
We present Thermal-NeRF, which takes thermal and visible raw images as inputs, to accomplish visible and thermal view synthesis simultaneously.
arXiv Detail & Related papers (2024-03-21T00:35:31Z) - ThermoNeRF: Multimodal Neural Radiance Fields for Thermal Novel View Synthesis [5.66229031510643]
We propose ThermoNeRF, a novel approach to rendering new RGB and thermal views of a scene jointly.
To overcome the lack of texture in thermal images, we use paired RGB and thermal images to learn scene density.
We also introduce ThermoScenes, a new dataset that addresses the lack of available RGB+thermal datasets for scene reconstruction.
arXiv Detail & Related papers (2024-03-18T18:10:34Z) - Thermal-NeRF: Neural Radiance Fields from an Infrared Camera [29.58060552299745]
We introduce Thermal-NeRF, the first method that estimates a volumetric scene representation in the form of a NeRF solely from IR imaging.
We conduct extensive experiments to demonstrate that Thermal-NeRF can achieve superior quality compared to existing methods.
arXiv Detail & Related papers (2024-03-15T14:27:15Z) - PERF: Panoramic Neural Radiance Field from a Single Panorama [109.31072618058043]
PERF is a novel view synthesis framework that trains a panoramic neural radiance field from a single panorama.
We propose a novel collaborative RGBD inpainting method and a progressive inpainting-and-erasing method to lift up a 360-degree 2D scene to a 3D scene.
Our PERF can be widely used for real-world applications, such as panorama-to-3D, text-to-3D, and 3D scene stylization applications.
arXiv Detail & Related papers (2023-10-25T17:59:01Z) - PDRF: Progressively Deblurring Radiance Field for Fast and Robust Scene Reconstruction from Blurry Images [75.87721926918874]
We present Progressively Deblurring Radiance Field (PDRF).
PDRF is a novel approach that efficiently reconstructs high-quality radiance fields from blurry images.
We show that PDRF is 15X faster than previous state-of-the-art scene reconstruction methods.
arXiv Detail & Related papers (2022-08-17T03:42:29Z) - Urban Radiance Fields [77.43604458481637]
We perform 3D reconstruction and novel view synthesis from data captured by scanning platforms commonly deployed for world mapping in urban outdoor environments.
Our approach extends Neural Radiance Fields, which have been demonstrated to synthesize realistic novel images for small scenes in controlled settings.
Each of these three extensions provides significant performance improvements in experiments on Street View data.
arXiv Detail & Related papers (2021-11-29T15:58:16Z) - iNeRF: Inverting Neural Radiance Fields for Pose Estimation [68.91325516370013]
We present iNeRF, a framework that performs mesh-free pose estimation by "inverting" a Neural Radiance Field (NeRF).
NeRFs have been shown to be remarkably effective for the task of view synthesis.
arXiv Detail & Related papers (2020-12-10T18:36:40Z) - NeRF++: Analyzing and Improving Neural Radiance Fields [117.73411181186088]
Neural Radiance Fields (NeRF) achieve impressive view synthesis results for a variety of capture settings.
NeRF fits multi-layer perceptrons representing view-invariant opacity and view-dependent color volumes to a set of training images.
We address a parametrization issue involved in applying NeRF to 360 captures of objects within large-scale, 3D scenes.
arXiv Detail & Related papers (2020-10-15T03:24:14Z)