Robust Vehicle Localization and Tracking in Rain using Street Maps
- URL: http://arxiv.org/abs/2409.01038v1
- Date: Mon, 2 Sep 2024 08:15:12 GMT
- Title: Robust Vehicle Localization and Tracking in Rain using Street Maps
- Authors: Yu Xiang Tan, Malika Meghjani
- Abstract summary: We propose a novel approach for vehicle localization that uses street network based map information to correct drifting odometry estimates.
Specifically, our approach is a flexible fusion algorithm that integrates intermittent GPS, drifting IMU, and VO estimates.
We evaluate our proposed approach on four geographically diverse datasets from different countries.
- Score: 2.2651698012357473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: GPS-based vehicle localization and tracking suffer from unstable positional information commonly experienced in tunnel segments and in dense urban areas. Moreover, both Visual Odometry (VO) and Visual Inertial Odometry (VIO) are susceptible to adverse weather conditions that cause occlusions or blur in the visual input. In this paper, we propose a novel approach for vehicle localization that uses street network based map information to correct drifting odometry estimates and intermittent GPS measurements, especially in adverse scenarios such as driving in rain and tunnels. Specifically, our approach is a flexible fusion algorithm that integrates intermittent GPS, drifting IMU, and VO estimates together with 2D map information for robust vehicle localization and tracking. We refer to our approach as Map-Fusion. We evaluate our proposed approach on four geographically diverse datasets from different countries, covering both clear and rainy weather conditions. These datasets also include challenging visual segments in tunnels and underpasses. We show that, with the integration of the map information, our Map-Fusion algorithm reduces the error of state-of-the-art VO and VIO approaches across all datasets. We also validate our proposed algorithm in a real-world environment and in real time on a hardware-constrained mobile robot. Map-Fusion achieved 2.46 m error in clear weather and 6.05 m error in rainy weather over a 150 m route.
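As an illustration of the kind of fusion loop the abstract describes, the sketch below combines drifting odometry, intermittent GPS fixes, and a 2D street map represented as line segments. It is a minimal, hypothetical example under assumed blending weights and a simple snap-to-segment heuristic, not the authors' Map-Fusion implementation; the names (`fuse_step`, `snap_to_map`) and the toy map are invented for illustration.

```python
# Illustrative sketch only: a simplified map-aided fusion loop in the spirit of
# the Map-Fusion idea described in the abstract. The street map, the fixed
# blending weights, and the snap-to-segment heuristic are assumptions made for
# this example, not the paper's algorithm.
import math

def project_to_segment(p, a, b):
    """Project point p onto segment a-b and return the closest point."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return (ax + t * dx, ay + t * dy)

def snap_to_map(p, street_segments):
    """Return the closest point on any street segment of the 2D map prior."""
    candidates = (project_to_segment(p, a, b) for a, b in street_segments)
    return min(candidates, key=lambda q: math.dist(p, q))

def fuse_step(state, odom_delta, gps_fix, street_segments,
              gps_weight=0.7, map_weight=0.3):
    """One fusion step: odometry prediction, optional GPS correction, map constraint."""
    # 1) Predict with (possibly drifting) VO/IMU odometry.
    x, y = state[0] + odom_delta[0], state[1] + odom_delta[1]
    # 2) Correct with GPS when a fix is available (it may be intermittent).
    if gps_fix is not None:
        x = (1 - gps_weight) * x + gps_weight * gps_fix[0]
        y = (1 - gps_weight) * y + gps_weight * gps_fix[1]
    # 3) Pull the estimate toward the nearest street segment of the 2D map.
    mx, my = snap_to_map((x, y), street_segments)
    return ((1 - map_weight) * x + map_weight * mx,
            (1 - map_weight) * y + map_weight * my)

if __name__ == "__main__":
    # Toy map: two connected straight streets.
    streets = [((0.0, 0.0), (100.0, 0.0)), ((100.0, 0.0), (100.0, 80.0))]
    state = (0.0, 0.0)
    # Odometry drifts sideways; GPS drops out on the middle steps (e.g. a tunnel).
    odometry = [(10.0, 0.4)] * 5
    gps = [(10.0, 0.0), None, None, None, (50.0, 0.0)]
    for d, g in zip(odometry, gps):
        state = fuse_step(state, d, g, streets)
        print(f"fused estimate: ({state[0]:.1f}, {state[1]:.1f})")
```

In this toy run, GPS drops out for three steps (as it might in a tunnel) and the street-map constraint keeps the laterally drifting odometry close to the road.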
Related papers
- Pole-based Vehicle Localization with Vector Maps: A Camera-LiDAR Comparative Study [6.300346102366891]
In road environments, many common pieces of street furniture, such as traffic signs, traffic lights, and street lights, take the form of poles.
This paper introduces a real-time method for camera-based pole detection using a lightweight neural network trained on automatically annotated images.
The results highlight the high accuracy of the vision-based approach in open road conditions.
arXiv Detail & Related papers (2024-12-11T09:05:05Z) - Real-Time Metric-Semantic Mapping for Autonomous Navigation in Outdoor Environments [18.7565126823704]
We introduce an online metric-semantic mapping system that generates a global metric-semantic mesh map of large-scale outdoor environments.
Our mapping process achieves exceptional speed, with frame processing taking less than 7ms, regardless of scenario scale.
We integrate the resultant map into a real-world navigation system, enabling metric-semantic-based terrain assessment and autonomous point-to-point navigation within a campus environment.
arXiv Detail & Related papers (2024-11-30T00:05:10Z) - TopoSD: Topology-Enhanced Lane Segment Perception with SDMap Prior [70.84644266024571]
We propose to train a perception model to "see" standard definition maps (SDMaps).
We encode SDMap elements into neural spatial map representations and instance tokens, and then incorporate such complementary features as prior information.
Based on the lane segment representation framework, the model simultaneously predicts lanes, centrelines and their topology.
arXiv Detail & Related papers (2024-11-22T06:13:42Z) - Neural Semantic Map-Learning for Autonomous Vehicles [85.8425492858912]
We present a mapping system that fuses local submaps gathered from a fleet of vehicles at a central instance to produce a coherent map of the road environment.
Our method jointly aligns and merges the noisy and incomplete local submaps using a scene-specific Neural Signed Distance Field.
We leverage memory-efficient sparse feature-grids to scale to large areas and introduce a confidence score to model uncertainty in scene reconstruction.
arXiv Detail & Related papers (2024-10-10T10:10:03Z) - G-MEMP: Gaze-Enhanced Multimodal Ego-Motion Prediction in Driving [71.9040410238973]
We focus on inferring the ego trajectory of a driver's vehicle using their gaze data.
We develop G-MEMP, a novel multimodal ego-trajectory prediction network that combines GPS and video input with gaze data.
The results show that G-MEMP significantly outperforms state-of-the-art methods in both benchmarks.
arXiv Detail & Related papers (2023-12-13T23:06:30Z) - Beyond Cross-view Image Retrieval: Highly Accurate Vehicle Localization Using Satellite Image [91.29546868637911]
This paper addresses the problem of vehicle-mounted camera localization by matching a ground-level image with an overhead-view satellite map.
The key idea is to formulate the task as pose estimation and solve it by neural-net based optimization.
Experiments on standard autonomous vehicle localization datasets have confirmed the superiority of the proposed method.
arXiv Detail & Related papers (2022-04-10T19:16:58Z) - Trajectory Prediction for Autonomous Driving with Topometric Map [10.831436392239585]
State-of-the-art autonomous driving systems rely on high definition (HD) maps for localization and navigation.
We propose an end-to-end transformer-based approach for map-less autonomous driving.
arXiv Detail & Related papers (2021-05-09T08:16:16Z) - Radar-based Automotive Localization using Landmarks in a Multimodal Sensor Graph-based Approach [0.0]
In this paper, we address the problem of localization with automotive-grade radars.
The system uses landmarks and odometry information as an abstraction layer.
A single, semantic landmark map is used and maintained for all sensors.
arXiv Detail & Related papers (2021-04-29T07:35:20Z) - LiveMap: Real-Time Dynamic Map in Automotive Edge Computing [14.195521569220448]
LiveMap is a real-time dynamic map that detects, matches, and tracks objects on the road with sub-second latency, using crowdsourced data from connected vehicles.
We develop the control plane of LiveMap that allows adaptive offloading of vehicle computations.
We implement LiveMap on a small-scale testbed and develop a large-scale network simulator.
arXiv Detail & Related papers (2020-12-16T15:00:49Z) - 4Seasons: A Cross-Season Dataset for Multi-Weather SLAM in Autonomous Driving [48.588254700810474]
We present a novel dataset covering seasonal and challenging perceptual conditions for autonomous driving.
Among others, it enables research on visual odometry, global place recognition, and map-based re-localization tracking.
arXiv Detail & Related papers (2020-09-14T12:31:20Z) - Radar-based Dynamic Occupancy Grid Mapping and Object Detection [55.74894405714851]
In recent years, the classical occupancy grid map approach has been extended to dynamic occupancy grid maps.
This paper presents a further development of a previous approach.
The data of multiple radar sensors are fused, and a grid-based object tracking and mapping method is applied (a minimal occupancy grid update is sketched after this list).
arXiv Detail & Related papers (2020-08-09T09:26:30Z)
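For the radar-based occupancy grid entry above, the following is a minimal, textbook-style log-odds occupancy grid update, included only to illustrate the general grid-mapping idea; the inverse sensor model value, grid size, and the two hypothetical radar scans are assumptions, and this is not the cited paper's fusion method.

```python
# Minimal log-odds occupancy grid update (generic grid-mapping idea, not the
# cited paper's radar fusion method). Free-space ray updates are omitted for
# brevity; only occupied-cell evidence is accumulated.
import math

L_OCC = math.log(0.7 / 0.3)  # assumed inverse sensor model: P(occupied | detection) = 0.7

def update_grid(log_odds, detections, width, height):
    """Accumulate occupied-cell evidence from one sensor's detections."""
    for cx, cy in detections:
        if 0 <= cx < width and 0 <= cy < height:
            log_odds[cy][cx] += L_OCC
    return log_odds

def occupancy_probability(l):
    """Convert a log-odds value back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

if __name__ == "__main__":
    W, H = 5, 5
    grid = [[0.0] * W for _ in range(H)]
    # Two hypothetical radar scans: both detect cell (2, 2), each adds one extra cell.
    radar_a = [(2, 2), (4, 1)]
    radar_b = [(2, 2), (0, 3)]
    for scan in (radar_a, radar_b):
        grid = update_grid(grid, scan, W, H)
    print("P(occupied) at (2, 2):", round(occupancy_probability(grid[2][2]), 3))
    print("P(occupied) at (4, 1):", round(occupancy_probability(grid[1][4]), 3))
```

Cells reported by both scans accumulate more evidence and end up with a higher occupancy probability than cells reported by only one sensor.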
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.