Robust Vehicle Localization and Tracking in Rain using Street Maps
- URL: http://arxiv.org/abs/2409.01038v1
- Date: Mon, 2 Sep 2024 08:15:12 GMT
- Title: Robust Vehicle Localization and Tracking in Rain using Street Maps
- Authors: Yu Xiang Tan, Malika Meghjani
- Abstract summary: We propose a novel approach for vehicle localization that uses street network based map information to correct drifting odometry estimates.
Specifically, our approach is a flexible fusion algorithm that integrates intermittent GPS, drifting IMU and VO estimates.
We evaluate our proposed approach on four geographically diverse datasets from different countries.
- Score: 2.2651698012357473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: GPS-based vehicle localization and tracking suffer from the unstable positional information commonly experienced in tunnel segments and in dense urban areas. Both Visual Odometry (VO) and Visual Inertial Odometry (VIO) are also susceptible to adverse weather conditions that cause occlusions or blur in the visual input. In this paper, we propose a novel approach for vehicle localization that uses street-network-based map information to correct drifting odometry estimates and intermittent GPS measurements, especially in adversarial scenarios such as driving in rain and through tunnels. Specifically, our approach is a flexible fusion algorithm that integrates intermittent GPS, drifting IMU, and VO estimates together with 2D map information for robust vehicle localization and tracking. We refer to our approach as Map-Fusion. We evaluate our proposed approach on four geographically diverse datasets from different countries, covering both clear and rainy weather conditions. These datasets also include challenging visual segments in tunnels and underpasses. We show that, with the integration of map information, our Map-Fusion algorithm reduces the error of state-of-the-art VO and VIO approaches across all datasets. We also validate our proposed algorithm in real time on a hardware-constrained mobile robot in a real-world environment. Map-Fusion achieved 2.46 m error in clear weather and 6.05 m error in rain over a 150 m route.
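The abstract's core idea of correcting drifting odometry with a 2D street map can be illustrated with a minimal sketch. The function names, blending weights, and snapping strategy below are illustrative assumptions, not the paper's actual Map-Fusion algorithm: it blends the odometry estimate with a GPS fix when one is available, then pulls the result toward the nearest street segment.

```python
import math

# Hypothetical sketch of map-aided fusion: drifting odometry is corrected by
# (a) blending in intermittent GPS fixes and (b) projecting the estimate
# toward the nearest street segment of a 2D street-network map.
# Weights and names are illustrative assumptions, not the paper's algorithm.

def project_to_segment(p, a, b):
    """Project point p onto the line segment a-b (one street edge)."""
    ax, ay = a
    bx, by = b
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = max(0.0, min(1.0, ((p[0] - ax) * abx + (p[1] - ay) * aby) / denom))
    return (ax + t * abx, ay + t * aby)

def map_constrained_fuse(odom, gps, segments, gps_weight=0.7, map_weight=0.5):
    """Blend odometry with a GPS fix (if any), then snap toward the map."""
    ex, ey = odom
    if gps is not None:  # GPS may drop out in tunnels and urban canyons
        ex = (1 - gps_weight) * ex + gps_weight * gps[0]
        ey = (1 - gps_weight) * ey + gps_weight * gps[1]
    # Nearest point on any street segment pulls the estimate back on-road.
    nearest = min(
        (project_to_segment((ex, ey), a, b) for a, b in segments),
        key=lambda q: math.hypot(ex - q[0], ey - q[1]),
    )
    return ((1 - map_weight) * ex + map_weight * nearest[0],
            (1 - map_weight) * ey + map_weight * nearest[1])

# One straight street along the x-axis from (0, 0) to (100, 0).
segments = [((0.0, 0.0), (100.0, 0.0))]
drifted = (50.0, 4.0)  # odometry has drifted 4 m off the road
fused = map_constrained_fuse(drifted, None, segments)
print(fused)  # (50.0, 2.0) — pulled halfway back toward the street centerline
```

A full system would replace the fixed blending weights with covariance-weighted updates (e.g., an EKF) and handle ambiguous snapping near intersections, but the sketch shows why even a coarse street map bounds lateral odometry drift.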
Related papers
- Neural Semantic Map-Learning for Autonomous Vehicles [85.8425492858912]
We present a mapping system that fuses local submaps gathered from a fleet of vehicles at a central instance to produce a coherent map of the road environment.
Our method jointly aligns and merges the noisy and incomplete local submaps using a scene-specific Neural Signed Distance Field.
We leverage memory-efficient sparse feature-grids to scale to large areas and introduce a confidence score to model uncertainty in scene reconstruction.
arXiv Detail & Related papers (2024-10-10T10:10:03Z) - G-MEMP: Gaze-Enhanced Multimodal Ego-Motion Prediction in Driving [71.9040410238973]
We focus on inferring the ego trajectory of a driver's vehicle using their gaze data.
Next, we develop G-MEMP, a novel multimodal ego-trajectory prediction network that combines GPS and video input with gaze data.
The results show that G-MEMP significantly outperforms state-of-the-art methods in both benchmarks.
arXiv Detail & Related papers (2023-12-13T23:06:30Z) - Beyond Cross-view Image Retrieval: Highly Accurate Vehicle Localization Using Satellite Image [91.29546868637911]
This paper addresses the problem of vehicle-mounted camera localization by matching a ground-level image with an overhead-view satellite map.
The key idea is to formulate the task as pose estimation and solve it by neural-net based optimization.
Experiments on standard autonomous vehicle localization datasets have confirmed the superiority of the proposed method.
arXiv Detail & Related papers (2022-04-10T19:16:58Z) - Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow, and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z) - Trajectory Prediction for Autonomous Driving with Topometric Map [10.831436392239585]
State-of-the-art autonomous driving systems rely on high definition (HD) maps for localization and navigation.
We propose an end-to-end transformer-network-based approach for map-less autonomous driving.
arXiv Detail & Related papers (2021-05-09T08:16:16Z) - Radar-based Automotive Localization using Landmarks in a Multimodal Sensor Graph-based Approach [0.0]
In this paper, we address the problem of localization with automotive-grade radars.
The system uses landmarks and odometry information as an abstraction layer.
A single, semantic landmark map is used and maintained for all sensors.
arXiv Detail & Related papers (2021-04-29T07:35:20Z) - LiveMap: Real-Time Dynamic Map in Automotive Edge Computing [14.195521569220448]
LiveMap is a real-time dynamic map that detects, matches, and tracks objects on the road at sub-second latency using crowdsourced data from connected vehicles.
We develop the control plane of LiveMap that allows adaptive offloading of vehicle computations.
We implement LiveMap on a small-scale testbed and develop a large-scale network simulator.
arXiv Detail & Related papers (2020-12-16T15:00:49Z) - 4Seasons: A Cross-Season Dataset for Multi-Weather SLAM in Autonomous Driving [48.588254700810474]
We present a novel dataset covering seasonal and challenging perceptual conditions for autonomous driving.
Among others, it enables research on visual odometry, global place recognition, and map-based re-localization tracking.
arXiv Detail & Related papers (2020-09-14T12:31:20Z) - Radar-based Dynamic Occupancy Grid Mapping and Object Detection [55.74894405714851]
In recent years, the classical occupancy grid map approach has been extended to dynamic occupancy grid maps.
This paper presents the further development of a previous approach.
The data of multiple radar sensors are fused, and a grid-based object tracking and mapping method is applied.
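The classical occupancy-grid approach that this paper extends can be sketched in a few lines. The inverse sensor model probabilities below are illustrative assumptions, not values from the paper: each grid cell keeps a log-odds occupancy value that is raised on a sensor hit and lowered on a miss, which makes fusing repeated (or multi-sensor) observations a simple addition.

```python
import math

# Minimal sketch of the classical occupancy-grid update that dynamic
# occupancy grid maps extend. The sensor-model probabilities are
# illustrative assumptions, not taken from the paper.

P_HIT, P_MISS = 0.7, 0.4  # inverse sensor model (assumed values)

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def update_cell(log_odds, hit):
    """Bayesian log-odds update for one grid cell: fusion is addition."""
    return log_odds + (logit(P_HIT) if hit else logit(P_MISS))

def occupancy(log_odds):
    """Convert log-odds back to an occupancy probability."""
    return 1 - 1 / (1 + math.exp(log_odds))

cell = 0.0  # prior log-odds of 0 corresponds to p = 0.5
for hit in (True, True, False):  # two radar hits, one miss
    cell = update_cell(cell, hit)
print(round(occupancy(cell), 3))  # 0.784 — cell now likely occupied
```

Dynamic occupancy grid maps additionally attach velocity estimates to cells (typically via a particle filter) so that moving objects can be separated from the static map, but the additive log-odds core is the same.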
arXiv Detail & Related papers (2020-08-09T09:26:30Z) - Traffic Prediction Framework for OpenStreetMap using Deep Learning based Complex Event Processing and Open Traffic Cameras [4.6453787256723365]
We propose a deep learning-based Complex Event Processing (CEP) method that relies on publicly available video camera streams for traffic estimation.
The proposed framework performs near-real-time object detection and object property extraction across camera clusters in parallel to derive multiple traffic-related measures.
The system achieves a near-real-time performance of 1.42 seconds median latency and an average F-score of 0.80.
arXiv Detail & Related papers (2020-07-12T17:10:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.