Design of Sensor Fusion Driver Assistance System for Active Pedestrian
Safety
- URL: http://arxiv.org/abs/2201.09208v1
- Date: Sun, 23 Jan 2022 08:52:32 GMT
- Title: Design of Sensor Fusion Driver Assistance System for Active Pedestrian
Safety
- Authors: I-Hsi Kao, Ya-Zhu Yian, Jian-An Su, Yi-Horng Lai, Jau-Woei Perng,
Tung-Li Hsieh, Yi-Shueh Tsai, and Min-Shiu Hsieh
- Abstract summary: We present a sensor fusion detection system that combines a camera and 1D light detection and ranging (lidar) sensor for object detection.
The proposed system achieves a high level of accuracy for pedestrian and object detection in front of a vehicle and is highly robust in challenging environments.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a parallel architecture for a sensor fusion
detection system that combines a camera and 1D light detection and ranging
(lidar) sensor for object detection. The system contains two object detection
methods, one based on optical flow and the other using lidar. The two
sensors effectively complement each other's weaknesses: accurate
longitudinal localization of the object and information about its lateral
movement are obtained simultaneously. Using a spatio-temporal alignment
and a policy of sensor fusion, we completed the development of a fusion
detection system with high reliability at distances of up to 20 m. Test results
show that the proposed system achieves a high level of accuracy for pedestrian
or object detection in front of a vehicle and is highly robust in
challenging environments.
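To make the fusion policy concrete, the sketch below shows how such a system might pair the two streams: the 1D lidar contributes the longitudinal range, the optical-flow track contributes lateral motion, and a temporal alignment step matches the measurements before a fused detection is emitted. This is a minimal Python illustration under assumed data structures (LidarSample, FlowTrack) and thresholds; the paper does not publish code.

    from dataclasses import dataclass

    @dataclass
    class LidarSample:          # hypothetical container for one 1D-lidar reading
        t: float                # timestamp (s)
        range_m: float          # longitudinal distance to nearest object (m)

    @dataclass
    class FlowTrack:            # hypothetical container for one optical-flow track
        t: float                # timestamp (s)
        lateral_px_s: float     # lateral image motion of the tracked object (px/s)
        confidence: float       # detector confidence in [0, 1]

    def align(lidar, flow, max_dt=0.05):
        """Temporal alignment: pair each flow track with the closest lidar
        sample, discarding pairs more than max_dt seconds apart."""
        pairs = []
        for f in flow:
            nearest = min(lidar, key=lambda s: abs(s.t - f.t))
            if abs(nearest.t - f.t) <= max_dt:
                pairs.append((nearest, f))
        return pairs

    def fuse(pairs, max_range_m=20.0, min_conf=0.5):
        """Fusion policy: report a detection when the camera track is confident
        and the lidar confirms an object within the 20 m working range."""
        for lidar_s, flow_t in pairs:
            if flow_t.confidence >= min_conf and lidar_s.range_m <= max_range_m:
                yield {"t": flow_t.t,
                       "range_m": lidar_s.range_m,            # longitudinal, from lidar
                       "lateral_px_s": flow_t.lateral_px_s}   # lateral, from camera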
Related papers
- Automatic Spatial Calibration of Near-Field MIMO Radar With Respect to Optical Depth Sensors [4.328226032204419]
We propose a novel, joint calibration approach for optical RGB-D sensors and MIMO radars that is designed to operate in the radar's near-field range.
Our pipeline relies on a bespoke calibration target, allowing for automatic target detection and localization.
We validate our approach using two different depth sensing technologies from the optical domain.
arXiv Detail & Related papers (2024-03-16T17:24:46Z)
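To illustrate the calibration step in the entry above: once the target has been detected and localized by both sensors, the remaining problem is a rigid transform between the two point sets. The sketch below uses the classic Kabsch/SVD solution; it is an assumed, simplified stand-in for the authors' pipeline, and the correspondence step is taken as given.

    import numpy as np

    def rigid_transform(radar_pts, optical_pts):
        """Estimate R, t such that optical = R @ radar + t (least squares) from
        N corresponding 3D target detections (Kabsch algorithm); both inputs
        are (N, 3) arrays."""
        c_r = radar_pts.mean(axis=0)
        c_o = optical_pts.mean(axis=0)
        H = (radar_pts - c_r).T @ (optical_pts - c_o)    # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_o - R @ c_r
        return R, t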
- Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system effectively exploits the signals of the light-weight ToF sensor and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z)
- Vision Guided MIMO Radar Beamforming for Enhanced Vital Signs Detection in Crowds [26.129503530877006]
We develop a novel dual-sensing system, in which a vision sensor is leveraged to guide digital beamforming in a radar.
The calibrated dual system achieves roughly two-centimeter precision in three-dimensional space within a field of view of $75^\circ$ by $65^\circ$ at a range of up to two meters.
arXiv Detail & Related papers (2023-06-18T10:09:16Z)
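As a toy illustration of the vision-guided beamforming idea in the entry above: the camera supplies a direction for each subject, and the radar forms a beam toward it. The sketch below computes conventional delay-and-sum weights for a uniform linear array; the array geometry and parameter names are assumptions, not the paper's configuration.

    import numpy as np

    def steering_weights(theta_deg, n_elements=8, spacing_wl=0.5):
        """Delay-and-sum weights steering a uniform linear array toward the
        camera-derived azimuth theta_deg (element spacing in wavelengths)."""
        theta = np.deg2rad(theta_deg)
        n = np.arange(n_elements)
        phase = 2 * np.pi * spacing_wl * n * np.sin(theta)
        a = np.exp(1j * phase)            # steering vector a(theta)
        return a / np.sqrt(n_elements)    # unit-norm beamforming weights

    # Beamformed output for one array snapshot x of shape (n_elements,):
    # y = steering_weights(azimuth_from_camera).conj() @ x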
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution also integrates a fish-eye camera to monitor a wider portion of the sky and steer the other cameras towards objects of interest.
The thermal camera proves to be as feasible a solution as the video camera, even though the unit employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
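To illustrate the steering idea in the drone-detection entry above: a detection in the fish-eye image can be mapped to pan/tilt angles for the narrow-field cameras. The sketch below assumes an ideal equidistant fish-eye model (r = f * theta) with the optical axis pointing at the zenith; the paper's actual camera model and mount geometry may differ.

    import numpy as np

    def fisheye_to_pan_tilt(u, v, cx, cy, f_px):
        """Map a fish-eye detection pixel (u, v) to (pan, elevation) in degrees,
        assuming an equidistant model r = f * theta and a zenith-pointing axis."""
        dx, dy = u - cx, v - cy
        r = np.hypot(dx, dy)         # radial distance from image center (px)
        theta = r / f_px             # off-axis angle (rad)
        phi = np.arctan2(dy, dx)     # azimuth around the optical axis (rad)
        return np.degrees(phi), 90.0 - np.degrees(theta)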
- Target-aware Dual Adversarial Learning and a Multi-scenario Multi-Modality Benchmark to Fuse Infrared and Visible for Object Detection [65.30079184700755]
This study addresses the issue of fusing infrared and visible images that appear differently for object detection.
Previous approaches discover commonalities underlying the two modalities and fuse in that common space via either iterative optimization or deep networks.
This paper proposes a bilevel optimization formulation of the joint fusion and detection problem, then unrolls it into a target-aware Dual Adversarial Learning (TarDAL) network for fusion and a commonly used detection network.
arXiv Detail & Related papers (2022-03-30T11:44:56Z)
- Detecting and Identifying Optical Signal Attacks on Autonomous Driving Systems [25.32946739108013]
We propose a framework to detect and identify sensors that are under attack.
Specifically, we first develop a new technique to detect attacks on a system that consists of three sensors.
In our study, we use real data sets and state-of-the-art machine learning models to evaluate our attack detection scheme.
arXiv Detail & Related papers (2021-10-20T12:21:04Z)
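The three-sensor idea in the entry above can be pictured with a much simpler consistency check than the paper's learned model: if one sensor disagrees with two mutually consistent peers, flag it. The sketch below is a toy majority-vote baseline under an assumed common measurement (e.g., distance to the lead object).

    from typing import Optional

    def identify_attacked(readings: dict, tol: float = 0.5) -> Optional[str]:
        """Given one common measurement from three sensors, return the name of
        the sensor that disagrees with the two mutually consistent ones."""
        names = list(readings)
        assert len(names) == 3, "designed for exactly three sensors"
        for odd in names:
            a, b = (readings[n] for n in names if n != odd)
            dev = min(abs(readings[odd] - a), abs(readings[odd] - b))
            if abs(a - b) <= tol and dev > tol:
                return odd               # flagged as likely under attack
        return None                      # consistent, or no clear majority

    # Example: a spoofed lidar reports a phantom object much closer than it is.
    # identify_attacked({"lidar": 3.1, "camera": 9.8, "radar": 10.1}) -> "lidar"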
- On the Role of Sensor Fusion for Object Detection in Future Vehicular Networks [25.838878314196375]
We evaluate how using a combination of different sensors affects the detection of the environment in which the vehicles move and operate.
The final objective is to identify the optimal setup that would minimize the amount of data to be distributed over the channel.
arXiv Detail & Related papers (2021-04-23T18:58:37Z)
- GEM: Glare or Gloom, I Can Still See You -- End-to-End Multimodal Object Detector [11.161639542268015]
We propose sensor-aware multi-modal fusion strategies for 2D object detection in harsh-lighting conditions.
Our network learns to estimate the measurement reliability of each sensor modality in the form of scalar weights and masks.
We show that the proposed strategies outperform the existing state-of-the-art methods on the FLIR-Thermal dataset.
arXiv Detail & Related papers (2021-02-24T14:56:37Z)
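The scalar-weight idea in the GEM entry above can be pictured as a softmax gate over per-modality reliability scores that scales each feature map before fusion. The sketch below is a numpy toy with hand-set scores; in GEM the weights (and masks) are learned end-to-end inside the detector.

    import numpy as np

    def weighted_fusion(features: dict, quality: dict) -> np.ndarray:
        """Fuse same-shape per-modality feature maps with softmax-normalized
        scalar weights derived from estimated measurement reliability."""
        names = list(features)
        scores = np.array([quality[n] for n in names], dtype=float)
        w = np.exp(scores - scores.max())
        w /= w.sum()                     # softmax over the modalities
        return sum(wi * features[n] for wi, n in zip(w, names))

    # Example: under glare the RGB stream gets a low reliability score, so the
    # thermal stream dominates the fused feature map.
    # fused = weighted_fusion({"rgb": f_rgb, "thermal": f_th},
    #                         {"rgb": -1.0, "thermal": 2.0})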
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z)
- Towards robust sensing for Autonomous Vehicles: An adversarial perspective [82.83630604517249]
It is of primary importance that the resulting decisions are robust to perturbations.
Adversarial perturbations are purposefully crafted alterations of the environment or of the sensory measurements.
A careful evaluation of the vulnerabilities of their sensing system(s) is necessary in order to build and deploy safer systems.
arXiv Detail & Related papers (2020-07-14T05:25:15Z)
- Learning Selective Sensor Fusion for States Estimation [47.76590539558037]
We propose SelectFusion, an end-to-end selective sensor fusion module.
During prediction, the network is able to assess the reliability of the latent features from different sensor modalities.
We extensively evaluate all fusion strategies in both public datasets and on progressively degraded datasets.
arXiv Detail & Related papers (2019-12-30T20:25:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.