Adver-City: Open-Source Multi-Modal Dataset for Collaborative Perception Under Adverse Weather Conditions
- URL: http://arxiv.org/abs/2410.06380v1
- Date: Tue, 8 Oct 2024 21:26:22 GMT
- Title: Adver-City: Open-Source Multi-Modal Dataset for Collaborative Perception Under Adverse Weather Conditions
- Authors: Mateus Karvat, Sidney Givigi
- Abstract summary: Adverse weather conditions pose a significant challenge to the widespread adoption of Autonomous Vehicles.
We introduce Adver-City, the first open-source synthetic Collaborative Perception dataset focused on adverse weather conditions.
It contains over 24 thousand frames, over 890 thousand annotations, and 110 unique scenarios across six different weather conditions.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Adverse weather conditions pose a significant challenge to the widespread adoption of Autonomous Vehicles (AVs) by impacting sensors like LiDARs and cameras. Even though Collaborative Perception (CP) improves AV perception in difficult conditions, existing CP datasets lack adverse weather conditions. To address this, we introduce Adver-City, the first open-source synthetic CP dataset focused on adverse weather conditions. Simulated in CARLA with OpenCDA, it contains over 24 thousand frames, over 890 thousand annotations, and 110 unique scenarios across six different weather conditions: clear weather, soft rain, heavy rain, fog, foggy heavy rain and, for the first time in a synthetic CP dataset, glare. It has six object categories including pedestrians and cyclists, and uses data from vehicles and roadside units featuring LiDARs, RGB and semantic segmentation cameras, GNSS, and IMUs. Its scenarios, based on real crash reports, depict the most relevant road configurations for adverse weather and poor visibility conditions, varying in object density, with both dense and sparse scenes, allowing for novel testing conditions of CP models. Benchmarks run on the dataset show that weather conditions created challenging conditions for perception models, reducing multi-modal object detection performance by up to 19%, while object density affected LiDAR-based detection by up to 29%. The dataset, code and documentation are available at https://labs.cs.queensu.ca/quarrg/datasets/adver-city/.
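The abstract reports detection drops of "up to 19%" and "up to 29%" relative to the easier condition. As a minimal sketch of how such relative degradation figures are computed (the AP values below are hypothetical; only the resulting percentage mirrors the figure quoted above):

```python
def relative_drop(baseline_ap: float, degraded_ap: float) -> float:
    """Percentage drop in detection performance relative to the baseline."""
    return 100.0 * (baseline_ap - degraded_ap) / baseline_ap

# Hypothetical average-precision values for a multi-modal detector in
# clear weather vs. foggy heavy rain; chosen so the drop matches the
# up-to-19% degradation reported for Adver-City benchmarks.
clear_ap = 0.70
foggy_heavy_rain_ap = 0.567
print(f"{relative_drop(clear_ap, foggy_heavy_rain_ap):.0f}% drop")
```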
Related papers
- Panoptic-CUDAL Technical Report: Rural Australia Point Cloud Dataset in Rainy Conditions [18.246913297418686]
We introduce the Panoptic-CUDAL dataset, a novel dataset purpose-built for panoptic segmentation in rural areas subject to rain.
By recording high-resolution LiDAR, camera, and pose data, Panoptic-CUDAL offers a diverse, information-rich dataset in a challenging scenario.
We present analysis of the recorded data and provide baseline results for panoptic and semantic segmentation methods on LiDAR point clouds.
arXiv Detail & Related papers (2025-03-20T17:41:16Z)
- SCOPE: A Synthetic Multi-Modal Dataset for Collective Perception Including Physical-Correct Weather Conditions [0.5026434955540995]
SCOPE is the first synthetic multi-modal dataset that incorporates realistic camera and LiDAR models as well as parameterized and physically accurate weather simulations.
The dataset contains 17,600 frames from over 40 diverse scenarios with up to 24 collaborative agents, infrastructure sensors, and passive traffic, including cyclists and pedestrians.
arXiv Detail & Related papers (2024-08-06T09:35:50Z)
- NiteDR: Nighttime Image De-Raining with Cross-View Sensor Cooperative Learning for Dynamic Driving Scenes [49.92839157944134]
In nighttime driving scenes, insufficient and uneven lighting shrouds the scene in darkness, resulting in degraded image quality and visibility.
We develop an image de-raining framework tailored for rainy nighttime driving scenes.
It aims to remove rain artifacts, enrich scene representation, and restore useful information.
arXiv Detail & Related papers (2024-02-28T09:02:33Z)
- WEDGE: A multi-weather autonomous driving dataset built from generative vision-language models [51.61662672912017]
We introduce WEDGE: a synthetic dataset generated with a vision-language generative model via prompting.
WEDGE consists of 3360 images in 16 extreme weather conditions manually annotated with 16513 bounding boxes.
We establish baseline performance for classification and detection with 53.87% test accuracy and 45.41 mAP.
arXiv Detail & Related papers (2023-05-12T14:42:47Z)
- AVisT: A Benchmark for Visual Object Tracking in Adverse Visibility [125.77396380698639]
AVisT is a benchmark for visual tracking in diverse scenarios with adverse visibility.
AVisT comprises 120 challenging sequences with 80k annotated frames, spanning 18 diverse scenarios.
We benchmark 17 popular and recent trackers on AVisT with detailed analysis of their tracking performance across attributes.
arXiv Detail & Related papers (2022-08-14T17:49:37Z)
- Ithaca365: Dataset and Driving Perception under Repeated and Challenging Weather Conditions [0.0]
We present a new dataset to enable robust autonomous driving via a novel data collection process.
The dataset includes images and point clouds from cameras and LiDAR sensors, along with high-precision GPS/INS.
We demonstrate the uniqueness of this dataset by analyzing the performance of baselines in amodal segmentation of road and objects.
arXiv Detail & Related papers (2022-08-01T22:55:32Z)
- Vision in adverse weather: Augmentation using CycleGANs with various object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach that uses synthesised adverse-condition datasets in autonomous racing (generated using CycleGAN) to improve the performance of four out of five state-of-the-art detectors.
arXiv Detail & Related papers (2022-01-10T10:02:40Z)
- TransWeather: Transformer-based Restoration of Images Degraded by Adverse Weather Conditions [77.20136060506906]
We propose TransWeather, a transformer-based end-to-end model with just a single encoder and a decoder.
TransWeather achieves significant improvements over the All-in-One network across multiple test datasets.
It is validated on real world test images and found to be more effective than previous methods.
arXiv Detail & Related papers (2021-11-29T18:57:09Z)
- Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
- Robustness of Object Detectors in Degrading Weather Conditions [7.91378990016322]
State-of-the-art object detection systems for autonomous driving achieve promising results in clear weather conditions.
These systems need to work in degrading weather conditions, such as rain, fog and snow.
Most approaches evaluate only on the KITTI dataset, which consists only of clear weather scenes.
arXiv Detail & Related papers (2021-06-16T13:56:07Z)
- 4Seasons: A Cross-Season Dataset for Multi-Weather SLAM in Autonomous Driving [48.588254700810474]
We present a novel dataset covering seasonal and challenging perceptual conditions for autonomous driving.
Among others, it enables research on visual odometry, global place recognition, and map-based re-localization tracking.
arXiv Detail & Related papers (2020-09-14T12:31:20Z)
- DAWN: Vehicle Detection in Adverse Weather Nature Dataset [4.09920839425892]
We present a new dataset consisting of real-world images collected under various adverse weather conditions called DAWN.
The dataset comprises a collection of 1000 images from real-traffic environments, which are divided into four sets of weather conditions: fog, snow, rain and sandstorms.
This data helps in interpreting the effects of adverse weather conditions on the performance of vehicle detection systems.
arXiv Detail & Related papers (2020-08-12T15:48:49Z)
- Canadian Adverse Driving Conditions Dataset [8.428999369859318]
The Canadian Adverse Driving Conditions dataset was collected with the Autonomoose autonomous vehicle platform.
The dataset is the first autonomous vehicle dataset that focuses on adverse driving conditions specifically.
It contains 7,000 frames of annotated data from 8 cameras, collected in a variety of winter weather conditions.
arXiv Detail & Related papers (2020-01-27T23:21:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.