Long-Range Thermal 3D Perception in Low Contrast Environments
- URL: http://arxiv.org/abs/2112.05280v1
- Date: Fri, 10 Dec 2021 01:16:44 GMT
- Title: Long-Range Thermal 3D Perception in Low Contrast Environments
- Authors: Andrey Filippov, Olga Filippova
- Abstract summary: This report discusses the results of a Phase I effort to prove the feasibility of dramatically improving the sensitivity of microbolometer-based Long Wave Infrared (LWIR) detectors.
The resulting low-SWaP-C thermal depth-sensing system will enable situational awareness for Autonomous Air Vehicles in Advanced Air Mobility (AAM).
It will provide robust 3D information about the surrounding environment, including low-contrast static and moving objects, at long range in degraded visual conditions and GPS-denied areas.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This report discusses the results of an SBIR Phase I effort to prove
the feasibility of dramatically improving the sensitivity of
microbolometer-based Long Wave Infrared (LWIR) detectors, especially for 3D
measurements. The resulting low-SWaP-C thermal depth-sensing system will enable
situational awareness for Autonomous Air Vehicles in Advanced Air Mobility
(AAM). It will provide robust 3D information about the surrounding environment,
including low-contrast static and moving objects, at long range in degraded
visual conditions and GPS-denied areas. Our multi-sensor 3D perception, enabled
by COTS uncooled thermal sensors, mitigates the major weakness of LWIR sensors
(low contrast) by increasing system sensitivity by over an order of magnitude.
Because no thermal image sets suitable for evaluating this technology were
available, dataset acquisition became our first goal. We discuss the design and
construction of a prototype system with sixteen 640 x 512 pixel LWIR detectors,
camera calibration to subpixel resolution, and the capture and processing of
synchronized images. The results show a 3.84x contrast increase for
intrascene-only data and an additional 5.5x increase with interscene
accumulation, reaching a system noise-equivalent temperature difference (NETD)
of 1.9 mK with 40 mK sensors.
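As a quick sanity check on the abstract's numbers, the reported 1.9 mK system NETD follows directly from dividing the 40 mK sensor NETD by the two reported contrast gains. A minimal sketch (variable names are illustrative, not from the paper):

```python
# Sanity-check the sensitivity claim from the abstract:
# system NETD = sensor NETD / (intrascene gain * interscene gain)
sensor_netd_mk = 40.0    # NETD of the COTS uncooled LWIR sensors, in mK
intrascene_gain = 3.84   # contrast increase from intrascene-only processing
interscene_gain = 5.5    # additional increase from interscene accumulation

system_netd_mk = sensor_netd_mk / (intrascene_gain * interscene_gain)
print(f"{system_netd_mk:.1f} mK")  # prints "1.9 mK", matching the reported figure
```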
Related papers
- Rotational Odometry using Ultra Low Resolution Thermal Cameras [1.3986052523534573]
This letter provides what is, to the best of our knowledge, a first study on the applicability of ultra-low-resolution thermal cameras for rotational odometry measurements.
Our use of an ultra-low-resolution thermal camera instead of other modalities such as an RGB camera is motivated by its robustness to lighting conditions.
Experiments and ablation studies are conducted for determining the impact of thermal camera resolution and the number of successive frames on the CNN estimation precision.
arXiv Detail & Related papers (2024-11-02T12:15:32Z)
- Thermal3D-GS: Physics-induced 3D Gaussians for Thermal Infrared Novel-view Synthesis [11.793425521298488]
This paper introduces a physics-induced 3D Gaussian splatting method named Thermal3D-GS.
The first large-scale benchmark dataset for this field named Thermal Infrared Novel-view Synthesis dataset (TI-NSD) is created.
The results indicate that our method outperforms the baseline method with a 3.03 dB improvement in PSNR.
arXiv Detail & Related papers (2024-09-12T13:46:53Z)
- Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system well exploits the signals of light-weight ToF sensors and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z)
- Vision Transformers, a new approach for high-resolution and large-scale mapping of canopy heights [50.52704854147297]
We present a new vision transformer (ViT) model optimized with a classification (discrete) and a continuous loss function.
This model achieves better accuracy than previously used convolutional based approaches (ConvNets) optimized with only a continuous loss function.
arXiv Detail & Related papers (2023-04-22T22:39:03Z)
- On the Importance of Accurate Geometry Data for Dense 3D Vision Tasks [61.74608497496841]
Training on inaccurate or corrupt data induces model bias and hampers generalisation capabilities.
This paper investigates the effect of sensor errors for the dense 3D vision tasks of depth estimation and reconstruction.
arXiv Detail & Related papers (2023-03-26T22:32:44Z)
- DensePose From WiFi [86.61881052177228]
We develop a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions.
Our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches.
arXiv Detail & Related papers (2022-12-31T16:48:43Z)
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution integrates a fish-eye camera as well to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera is shown to be a feasible solution as good as the video camera, even if the camera employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
- MilliTRACE-IR: Contact Tracing and Temperature Screening via mm-Wave and Infrared Sensing [4.6838063911731025]
milliTRACE-IR is a joint mm-wave radar and infrared imaging sensing system.
The system achieves fully automated measurement of distancing and body temperature.
A person with high body temperature is reliably detected by the thermal camera sensor and subsequently traced across a large indoor area in a non-invasive way by the radars.
arXiv Detail & Related papers (2021-10-08T08:58:36Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- Fusion of Real Time Thermal Image and 1D/2D/3D Depth Laser Readings for Remote Thermal Sensing in Industrial Plants by Means of UAVs and/or Robots [0.0]
This paper presents fast procedures for thermal infrared remote sensing in dark, GPS-denied environments.
The procedures are based on combining thermal imagery with depth estimates obtained from either a 1-Dimensional LIDAR laser or a 2-Dimensional Hokuyo laser.
The combination of these sensors/cameras is suitable to be mounted on Unmanned Aerial Vehicles (UAVs) and/or robots.
arXiv Detail & Related papers (2020-06-01T21:52:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.