Fusion of Real Time Thermal Image and 1D/2D/3D Depth Laser Readings for
Remote Thermal Sensing in Industrial Plants by Means of UAVs and/or Robots
- URL: http://arxiv.org/abs/2006.01286v3
- Date: Thu, 4 Jun 2020 10:22:23 GMT
- Title: Fusion of Real Time Thermal Image and 1D/2D/3D Depth Laser Readings for
Remote Thermal Sensing in Industrial Plants by Means of UAVs and/or Robots
- Authors: Corneliu Arsene
- Abstract summary: This paper presents fast procedures for thermal infrared remote sensing in dark, GPS-denied environments.
The procedures combine depth estimates from a 1-Dimensional LIDAR sensor, a 2-Dimensional Hokuyo laser, or a 3D MultiSense SLB laser with images from a FLIR Duo R dual-sensor thermal camera.
This combination of sensors/cameras is suitable for mounting on Unmanned Aerial Vehicles (UAVs) and/or robots.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents fast procedures for thermal infrared remote sensing in
dark, GPS-denied environments, such as those found in industrial plants, for
example High-Voltage Direct Current (HVDC) converter stations. These procedures
combine the depth estimates obtained from either a 1-Dimensional LIDAR sensor,
a 2-Dimensional Hokuyo laser, or a 3D MultiSense SLB laser sensor with the
visible and thermal images from a FLIR Duo R dual-sensor thermal camera. This
combination of sensors/cameras is suitable for mounting on Unmanned Aerial
Vehicles (UAVs) and/or robots, in order to provide reliable information about
potential malfunctions within the hazardous environment. As an example, the
capabilities of the developed software and hardware system that combines the
1-D LIDAR sensor with the FLIR Duo R dual-sensor thermal camera are assessed in
terms of accuracy and required computational time: across a number of test
cases, the computational times are under 10 ms, with a maximum localization
error of 8 mm and an average standard deviation of the measured temperatures of
1.11 degrees Celsius.
The paper is structured as follows: the system used for the identification and
localization of hotspots in industrial plants is described in Section II. In
Section III, the method for fault identification and localization in plants
using a 1-Dimensional LIDAR sensor and thermal images is described together
with results. In Section IV the real-time thermal image processing is
presented. The fusion of the 2-Dimensional Hokuyo depth laser and the thermal
images is described in Section V. In Section VI the combination of the 3D
MultiSense SLB laser and thermal images is described. In Section VII a
discussion is given and several conclusions are drawn.
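To make the 1-D LIDAR / thermal-camera fusion concrete, the sketch below shows, in Python, how a single LIDAR range reading could be combined with a radiometric thermal frame to localize a hotspot and summarize its temperature. This is a minimal illustration, not the authors' implementation: the pinhole projection model, the focal lengths fx_px/fy_px, the alarm threshold, and all function names are assumptions introduced here.

```python
import numpy as np

def pixel_offset_to_metric(px_offset, py_offset, depth_m,
                           fx_px=800.0, fy_px=800.0):
    """Convert a hotspot's pixel offset from the image centre into a metric
    offset at the measured depth, using a pinhole model. fx_px/fy_px are
    assumed focal lengths in pixels (illustrative, not from the paper)."""
    dx_m = px_offset * depth_m / fx_px
    dy_m = py_offset * depth_m / fy_px
    return dx_m, dy_m

def hotspot_report(thermal_img_celsius, depth_m, threshold_c=60.0, window=5):
    """Find the hottest pixel above an alarm threshold, estimate its metric
    offset from the optical axis at the LIDAR-measured depth, and summarise
    the temperatures in a small window around it."""
    h, w = thermal_img_celsius.shape
    iy, ix = np.unravel_index(np.argmax(thermal_img_celsius), (h, w))
    t_max = thermal_img_celsius[iy, ix]
    if t_max < threshold_c:
        return None  # no hotspot above the alarm threshold

    # Pixel offset from the image centre (assumed aligned with the LIDAR beam).
    px_offset = ix - w / 2.0
    py_offset = iy - h / 2.0
    dx_m, dy_m = pixel_offset_to_metric(px_offset, py_offset, depth_m)

    # Temperature statistics over a small window around the hotspot.
    y0, y1 = max(0, iy - window), min(h, iy + window + 1)
    x0, x1 = max(0, ix - window), min(w, ix + window + 1)
    patch = thermal_img_celsius[y0:y1, x0:x1]
    return {
        "t_max_c": float(t_max),
        "t_mean_c": float(patch.mean()),
        "t_std_c": float(patch.std()),
        "offset_m": (dx_m, dy_m),
        "depth_m": depth_m,
    }

# Example with synthetic data: a 512x640 frame at 30 C with a 75 C hotspot,
# and a 1-D LIDAR range of 4.2 m to the inspected component.
frame = np.full((512, 640), 30.0)
frame[200:210, 400:410] = 75.0
print(hotspot_report(frame, depth_m=4.2))
```

The same idea extends to the 2-D Hokuyo and 3-D MultiSense SLB cases described in sections V and VI, where depth is available per scan line or per pixel rather than as a single range along the optical axis.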
Related papers
- In-Situ Infrared Camera Monitoring for Defect and Anomaly Detection in Laser Powder Bed Fusion: Calibration, Data Mapping, and Feature Extraction [0.26999000177990923]
The laser powder bed fusion (LPBF) process can incur defects due to melt pool instabilities, spattering, temperature increases, and powder spread anomalies.
Identifying defects through in-situ monitoring typically requires collecting, storing, and analyzing large amounts of generated data.
arXiv Detail & Related papers (2024-07-17T16:02:22Z) - Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a
Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system well exploits the signals of light-weight ToF sensors and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z) - Photometric Correction for Infrared Sensors [1.170732359523702]
This article proposes a photometric correction model for infrared sensors based on temperature constancy.
Experiments show that the reconstruction quality from the corrected infrared imagery achieves performance on par with state-of-the-art reconstruction using RGB sensors.
arXiv Detail & Related papers (2023-04-08T06:32:57Z) - Learning Online Multi-Sensor Depth Fusion [100.84519175539378]
SenFuNet is a depth fusion approach that learns sensor-specific noise and outlier statistics.
We conduct experiments with various sensor combinations on the real-world CoRBS and Scene3D datasets.
arXiv Detail & Related papers (2022-04-07T10:45:32Z) - Long-Range Thermal 3D Perception in Low Contrast Environments [0.0]
This report discusses the results of a Phase I effort to prove the feasibility of dramatically improving the sensitivity of microbolometer-based Long Wave Infrared (LWIR) detectors.
The resulting low SWaP-C thermal depth-sensing system will enable the situational awareness of Autonomous Air Vehicles for Advanced Air Mobility (AAM).
It will provide robust 3D information about the surrounding environment, including low-contrast static and moving objects, at far distances, in degraded visual conditions and GPS-denied areas.
arXiv Detail & Related papers (2021-12-10T01:16:44Z) - MilliTRACE-IR: Contact Tracing and Temperature Screening via mm-Wave and
Infrared Sensing [4.6838063911731025]
milliTRACE-IR is a joint mm-wave radar and infrared imaging sensing system.
The system achieves fully automated measurement of distancing and body temperature.
A person with high body temperature is reliably detected by the thermal camera sensor and subsequently traced across a large indoor area in a non-invasive way by the radars.
arXiv Detail & Related papers (2021-10-08T08:58:36Z) - LIF-Seg: LiDAR and Camera Image Fusion for 3D LiDAR Semantic
Segmentation [78.74202673902303]
We propose a coarse-to-fine LiDAR and camera fusion-based network (termed LIF-Seg) for LiDAR segmentation.
The proposed method fully utilizes the contextual information of images and introduces a simple but effective early-fusion strategy.
The cooperation of these two components leads to effective camera-LiDAR fusion.
arXiv Detail & Related papers (2021-08-17T08:53:11Z) - EPMF: Efficient Perception-aware Multi-sensor Fusion for 3D Semantic Segmentation [62.210091681352914]
We study multi-sensor fusion for 3D semantic segmentation for many applications, such as autonomous driving and robotics.
In this work, we investigate a collaborative fusion scheme called perception-aware multi-sensor fusion (PMF).
We propose a two-stream network to extract features from the two modalities separately. The extracted features are fused by effective residual-based fusion modules.
arXiv Detail & Related papers (2021-06-21T10:47:26Z) - Online Photometric Calibration of Automatic Gain Thermal Infrared
Cameras [0.0]
We introduce an algorithm for online photometric calibration of thermal-infrared cameras.
Our proposed method does not require any specific driver/hardware support.
We present this in the context of visual odometry and SLAM algorithms.
arXiv Detail & Related papers (2020-12-07T17:51:54Z) - Towards Online Monitoring and Data-driven Control: A Study of
Segmentation Algorithms for Laser Powder Bed Fusion Processes [83.97264034062673]
An increasing number of laser powder bed fusion machines use off-axis infrared cameras to improve online monitoring and data-driven control capabilities.
We study over 30 segmentation algorithms that segment each infrared image into a foreground and background.
The identified algorithms can be readily applied to the laser powder bed fusion machines to address each of the above limitations and thus, significantly improve process control.
arXiv Detail & Related papers (2020-11-18T03:30:16Z) - Exploring Thermal Images for Object Detection in Underexposure Regions
for Autonomous Driving [67.69430435482127]
Underexposure regions are vital to construct a complete perception of the surroundings for safe autonomous driving.
The availability of thermal cameras has provided an essential alternative for exploring regions where other optical sensors fail to capture interpretable signals.
This work proposes a domain adaptation framework which employs a style transfer technique for transfer learning from visible spectrum images to thermal images.
arXiv Detail & Related papers (2020-06-01T09:59:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.