Real-time Human Detection in Fire Scenarios using Infrared and Thermal
Imaging Fusion
- URL: http://arxiv.org/abs/2307.04223v1
- Date: Sun, 9 Jul 2023 16:28:57 GMT
- Title: Real-time Human Detection in Fire Scenarios using Infrared and Thermal
Imaging Fusion
- Authors: Truong-Dong Do, Nghe-Nhan Truong and My-Ha Le
- Abstract summary: Fire is one of the most serious threats to human life and carries a high probability of fatalities.
A vision-based human detection system can improve the chances of saving more lives.
This paper proposes a thermal and infrared imaging fusion strategy based on multiple cameras for human detection in low-visibility scenarios caused by smoke.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fire is one of the most serious threats to human life and carries a
high probability of fatalities. These severe consequences largely stem from the
heavy smoke emitted by a fire, which restricts the visibility of escaping
victims and rescue squads. In such hazardous circumstances, a vision-based
human detection system can improve the chances of saving more lives. To this
end, this paper proposes a thermal and infrared imaging fusion strategy based
on multiple cameras for human detection in low-visibility scenarios caused by
smoke. By processing input from multiple cameras, vital information can be
gathered to generate more useful features for human detection. First, the
cameras are calibrated using a Light Heating Chessboard. Afterward, the
features extracted from the input images are merged before being passed
through a lightweight deep neural network to perform the human detection task.
Experiments conducted on an NVIDIA Jetson Nano computer demonstrated that the
proposed method runs at a reasonable speed and achieves favorable performance,
with an mAP@0.5 of 95%.
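The abstract outlines a calibrate-align-fuse-detect pipeline: the thermal and infrared cameras are calibrated with a Light Heating Chessboard, the two views are aligned and merged, and a lightweight network runs detection on the Jetson Nano. The sketch below shows one way such a pipeline could be wired together with OpenCV; the chessboard size, the raw channel-stacking fusion, and the ONNX model name are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch only: pattern size, model name, and channel-stacking
# fusion are placeholders, not the authors' code.
import cv2
import numpy as np

PATTERN = (9, 6)                              # assumed inner-corner count of the heated chessboard
MODEL = "lightweight_human_detector.onnx"     # placeholder detector export


def alignment_homography(thermal_views, ir_views):
    """Estimate a homography mapping IR pixels onto the thermal view, using
    chessboard corners that the Light Heating Chessboard makes visible to
    both modalities."""
    t_pts, i_pts = [], []
    for t_img, i_img in zip(thermal_views, ir_views):
        ok_t, c_t = cv2.findChessboardCorners(t_img, PATTERN)
        ok_i, c_i = cv2.findChessboardCorners(i_img, PATTERN)
        if ok_t and ok_i:                     # keep only views where both cameras saw the pattern
            t_pts.append(c_t.reshape(-1, 2))
            i_pts.append(c_i.reshape(-1, 2))
    H, _ = cv2.findHomography(np.vstack(i_pts), np.vstack(t_pts), cv2.RANSAC)
    return H


def detect_humans(thermal_img, ir_img, H, net, size=(416, 416)):
    """Warp the IR frame onto the thermal frame, stack the two channels as a
    simple stand-in for the paper's feature fusion, and run the detector."""
    h, w = thermal_img.shape[:2]
    ir_aligned = cv2.warpPerspective(ir_img, H, (w, h))
    fused = np.dstack([thermal_img, ir_aligned]).astype(np.float32) / 255.0
    fused = cv2.resize(fused, size)
    blob = fused.transpose(2, 0, 1)[None]     # NCHW tensor for the assumed 2-channel ONNX model
    net.setInput(blob)
    return net.forward()                      # box decoding depends on the detector head


# Usage sketch:
#   net = cv2.dnn.readNetFromONNX(MODEL)
#   H = alignment_homography(thermal_calib_frames, ir_calib_frames)
#   preds = detect_humans(thermal_frame, ir_frame, H, net)
```

Note that the paper merges features extracted from the input images before the detection network; the raw channel stacking above is only a stand-in for that fusion step.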
Related papers
- MISFIT-V: Misaligned Image Synthesis and Fusion using Information from
Thermal and Visual [2.812395851874055]
This work presents Misaligned Image Synthesis and Fusion using Information from Thermal and Visual (MISFIT-V), a novel two-pronged unsupervised deep learning approach that utilizes a Generative Adversarial Network (GAN) and a cross-attention mechanism to capture the most relevant features from each modality.
Experimental results show MISFIT-V offers enhanced robustness against misalignment and poor lighting/thermal environmental conditions.
arXiv Detail & Related papers (2023-09-22T23:41:24Z)
- Robust Human Detection under Visual Degradation via Thermal and mmWave
Radar Fusion [4.178845249771262]
We present a multimodal human detection system that combines portable thermal cameras and single-chip mmWave radars.
We propose a Bayesian feature extractor and a novel uncertainty-guided fusion method that surpasses a variety of competing methods.
We evaluate the proposed method on real-world data collection and demonstrate that our approach outperforms the state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2023-07-07T14:23:20Z)
- ScatterNeRF: Seeing Through Fog with Physically-Based Inverse Neural
Rendering [83.75284107397003]
We introduce ScatterNeRF, a neural rendering method which renders scenes and decomposes the fog-free background.
We propose a disentangled representation for the scattering volume and the scene objects, and learn the scene reconstruction with physics-inspired losses.
We validate our method by capturing multi-view In-the-Wild data and controlled captures in a large-scale fog chamber.
arXiv Detail & Related papers (2023-05-03T13:24:06Z)
- Adversarial Infrared Blocks: A Multi-view Black-box Attack to Thermal
Infrared Detectors in Physical World [4.504479592538401]
We propose a novel physical attack called adversarial infrared blocks (AdvIB).
By optimizing the physical parameters of the adversarial infrared blocks, this method can execute a stealthy black-box attack on thermal imaging systems from various angles.
Our method involves attaching the adversarial infrared block to the inside of clothing, enhancing its stealthiness.
arXiv Detail & Related papers (2023-04-21T02:53:56Z)
- DensePose From WiFi [86.61881052177228]
We develop a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions.
Our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches.
arXiv Detail & Related papers (2022-12-31T16:48:43Z)
- Image-Based Fire Detection in Industrial Environments with YOLOv4 [53.180678723280145]
This work looks into the potential of AI to detect and recognize fires and reduce detection time using object detection on an image stream.
To this end, we collected and labeled appropriate data from several public sources, which have been used to train and evaluate several models based on the popular YOLOv4 object detector.
arXiv Detail & Related papers (2022-12-09T11:32:36Z)
- Privacy-Preserving Person Detection Using Low-Resolution Infrared
Cameras [9.801893730708134]
In intelligent building management, knowing the number of people in a room and their locations is important for better control of its illumination, ventilation, and heating, with reduced costs and improved comfort.
This is typically achieved by detecting people using embedded devices that are installed on the room's ceiling and that integrate a low-resolution infrared camera, which conceals each person's identity.
For accurate detection, state-of-the-art deep learning models still require supervised training using a large annotated dataset of images.
In this paper, we investigate cost-effective methods that are suitable for person detection based on low-resolution infrared images.
arXiv Detail & Related papers (2022-09-22T22:20:30Z)
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing
Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution also integrates a fish-eye camera to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera is shown to be a feasible solution, as good as the video camera, even though the camera employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
- Computer Vision-based Characterization of Large-scale Jet Flames using a
Synthetic Infrared Image Generation Approach [0.8431877864777444]
This paper proposes the use of Generative Adversarial Networks to produce plausible infrared images from visible ones.
Results suggest that it is possible to realistically replicate the results for experiments carried out using both visible and infrared cameras.
arXiv Detail & Related papers (2022-06-05T06:54:36Z)
- In-Bed Person Monitoring Using Thermal Infrared Sensors [53.561797148529664]
We use 'Griddy', a prototype with a Panasonic Grid-EYE, a low-resolution infrared thermopile array sensor, which offers more privacy.
For this purpose, two datasets were captured, one (480 images) under constant conditions, and a second one (200 images) under different variations.
We test three machine learning algorithms: Support Vector Machines (SVM), k-Nearest Neighbors (k-NN), and a Neural Network (NN).
arXiv Detail & Related papers (2021-07-16T15:59:07Z)
- Exploring Thermal Images for Object Detection in Underexposure Regions
for Autonomous Driving [67.69430435482127]
Underexposure regions are vital to constructing a complete perception of the surroundings for safe autonomous driving.
The availability of thermal cameras has provided an essential alternative for exploring regions where other optical sensors fail to capture interpretable signals.
This work proposes a domain adaptation framework which employs a style transfer technique for transfer learning from visible spectrum images to thermal images.
arXiv Detail & Related papers (2020-06-01T09:59:09Z)
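The last entry above adapts a detector to the thermal domain by borrowing labels from the visible spectrum through style transfer. A minimal PyTorch sketch of that idea, assuming a placeholder visible-to-thermal translator and a generic detector and loss (none of which are the authors' code), might look like this:

```python
import torch
from torch import nn


class VisibleToThermal(nn.Module):
    """Placeholder style-transfer generator; in practice a trained
    image-to-image translation network would be loaded here."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),       # 1-channel, thermal-like output
        )

    def forward(self, rgb):
        return self.net(rgb)


def adapt_detector(detector, translator, visible_loader, optimizer, loss_fn):
    """Fine-tune a detector on thermal-style renderings of the labeled
    visible-spectrum training set (the transfer-learning step)."""
    translator.eval()
    detector.train()
    for rgb, targets in visible_loader:
        with torch.no_grad():
            fake_thermal = translator(rgb)        # visible -> thermal-like style transfer
        loss = loss_fn(detector(fake_thermal), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```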
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.