Snowy Scenes, Clear Detections: A Robust Model for Traffic Light Detection in Adverse Weather Conditions
- URL: http://arxiv.org/abs/2406.13473v1
- Date: Wed, 19 Jun 2024 11:52:12 GMT
- Title: Snowy Scenes, Clear Detections: A Robust Model for Traffic Light Detection in Adverse Weather Conditions
- Authors: Shivank Garg, Abhishek Baghel, Amit Agarwal, Durga Toshniwal
- Abstract summary: Adverse weather presents major challenges for current detection systems, often resulting in failures and potential safety risks.
This paper introduces a novel framework and pipeline designed to improve object detection under such conditions.
Results show a 40.8% improvement in average IoU and F1 scores compared to naive fine-tuning.
- Score: 5.208045772970408
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the rise of autonomous vehicles and advanced driver-assistance systems (ADAS), ensuring reliable object detection in all weather conditions is crucial for safety and efficiency. Adverse weather like snow, rain, and fog presents major challenges for current detection systems, often resulting in failures and potential safety risks. This paper introduces a novel framework and pipeline designed to improve object detection under such conditions, focusing on traffic signal detection where traditional methods often fail due to domain shifts caused by adverse weather. We provide a comprehensive analysis of the limitations of existing techniques. Our proposed pipeline significantly enhances detection accuracy in snow, rain, and fog. Results show a 40.8% improvement in average IoU and F1 scores compared to naive fine-tuning and a 22.4% performance increase in domain shift scenarios, such as training on artificial snow and testing on rain images.
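The abstract reports gains in average IoU and F1 scores. As a refresher on these standard detection metrics, here is a minimal illustrative sketch (not the paper's evaluation code; the `(x1, y1, x2, y2)` box format and the TP/FP/FN counting convention are assumptions):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Intersection rectangle; width/height clamp to 0 when boxes do not overlap.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def f1_score(tp, fp, fn):
    """F1 from detection counts; a prediction typically counts as a true
    positive when its IoU with a ground-truth box exceeds a threshold."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0
```

For example, two unit-overlap boxes `(0, 0, 2, 2)` and `(1, 1, 3, 3)` intersect in area 1 with union 7, giving IoU of 1/7.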
Related papers
- Robust ADAS: Enhancing Robustness of Machine Learning-based Advanced Driver Assistance Systems for Adverse Weather [5.383130566626935]
This paper employs a Denoising Deep Neural Network as a preprocessing step to transform adverse weather images into clear weather images.
It improves driver visualization, which is critical for safe navigation in adverse weather conditions.
arXiv Detail & Related papers (2024-07-02T18:03:52Z) - Predicting the Influence of Adverse Weather on Pedestrian Detection with Automotive Radar and Lidar Sensors [2.4903631775244213]
Pedestrians are among the most endangered traffic participants in road traffic.
While pedestrian detection in nominal conditions is well established, the sensor and, therefore, the pedestrian detection performance degrades under adverse weather conditions.
We introduce a dedicated Weather Filter (WF) model that predicts the effects of rain and fog on pedestrian detection performance for a user-specified radar and lidar.
arXiv Detail & Related papers (2024-05-21T12:44:43Z) - Genuine Knowledge from Practice: Diffusion Test-Time Adaptation for Video Adverse Weather Removal [53.15046196592023]
We introduce test-time adaptation into adverse weather removal in videos.
We propose the first framework that integrates test-time adaptation into the iterative diffusion reverse process.
arXiv Detail & Related papers (2024-03-12T14:21:30Z) - Challenges of YOLO Series for Object Detection in Extremely Heavy Rain: CALRA Simulator based Synthetic Evaluation Dataset [0.0]
Object detection by diverse sensors (e.g., LiDAR, radar, and camera) should be prioritized for autonomous vehicles.
These sensors need to detect objects accurately and quickly in diverse weather conditions, but they often fail to detect objects consistently in bad weather with rain, snow, or fog.
In this study, based on experimentally obtained raindrop data from precipitation conditions, we constructed a novel dataset for testing diverse network models under various precipitation conditions.
arXiv Detail & Related papers (2023-12-13T08:45:57Z) - DARTH: Holistic Test-time Adaptation for Multiple Object Tracking [87.72019733473562]
Multiple object tracking (MOT) is a fundamental component of perception systems for autonomous driving.
Despite the safety-critical nature of driving systems, no solution to the MOT adaptation problem under test-time domain shift had previously been proposed.
We introduce DARTH, a holistic test-time adaptation framework for MOT.
arXiv Detail & Related papers (2023-10-03T10:10:42Z) - DADFNet: Dual Attention and Dual Frequency-Guided Dehazing Network for Video-Empowered Intelligent Transportation [79.18450119567315]
Adverse weather conditions pose severe challenges for video-based transportation surveillance.
We propose a dual attention and dual frequency-guided dehazing network (termed DADFNet) for real-time visibility enhancement.
arXiv Detail & Related papers (2023-04-19T11:55:30Z) - Ranking-Based Physics-Informed Line Failure Detection in Power Grids [66.0797334582536]
Real-time and accurate detection of potential line failures is the first step to mitigating the extreme weather impact and activating emergency controls.
The nonlinearity of power balance equations, increased uncertainty in generation during extreme events, and lack of grid observability compromise the efficiency of traditional data-driven failure detection methods.
This paper proposes a Physics-InformEd Line failure Detector (FIELD) that leverages grid topology information to reduce sample and time complexities and improve localization accuracy.
arXiv Detail & Related papers (2022-08-31T18:19:25Z) - Vision in adverse weather: Augmentation using CycleGANs with various object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach of using synthesised adverse condition datasets in autonomous racing (generated using CycleGAN) to improve the performance of four out of five state-of-the-art detectors.
arXiv Detail & Related papers (2022-01-10T10:02:40Z) - Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow, and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z) - Object Detection Under Rainy Conditions for Autonomous Vehicles: A Review of State-of-the-Art and Emerging Techniques [5.33024001730262]
This paper presents a tutorial on state-of-the-art techniques for mitigating the influence of rainy conditions on an autonomous vehicle's ability to detect objects.
Our goal includes surveying and analyzing the performance of object detection methods trained and tested using visual data captured under clear and rainy conditions.
arXiv Detail & Related papers (2020-06-30T02:05:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.