Uncertainty-Driven Radar-Inertial Fusion for Instantaneous 3D Ego-Velocity Estimation
- URL: http://arxiv.org/abs/2506.14294v1
- Date: Tue, 17 Jun 2025 08:10:39 GMT
- Title: Uncertainty-Driven Radar-Inertial Fusion for Instantaneous 3D Ego-Velocity Estimation
- Authors: Prashant Kumar Rai, Elham Kowsari, Nataliya Strokina, Reza Ghabcheloo,
- Abstract summary: We present a method for estimating ego-velocity in autonomous navigation by integrating high-resolution imaging radar with an inertial measurement unit. We employ a neural network to process complex-valued raw radar data and estimate instantaneous linear ego-velocity along with its associated uncertainty. This uncertainty-aware velocity estimate is then integrated with inertial measurement unit data using an Extended Kalman Filter.
- Score: 4.184845027588594
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a method for estimating ego-velocity in autonomous navigation by integrating high-resolution imaging radar with an inertial measurement unit. The proposed approach addresses the limitations of traditional radar-based ego-motion estimation techniques by employing a neural network to process complex-valued raw radar data and estimate instantaneous linear ego-velocity along with its associated uncertainty. This uncertainty-aware velocity estimate is then integrated with inertial measurement unit data using an Extended Kalman Filter. The filter leverages the network-predicted uncertainty to refine the inertial sensor's noise and bias parameters, improving the overall robustness and accuracy of the ego-motion estimation. We evaluated the proposed method on the publicly available ColoRadar dataset. Our approach achieves significantly lower error compared to the closest publicly available method and also outperforms both instantaneous and scan matching-based techniques.
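To make the fusion step concrete, the sketch below shows how a network-predicted velocity and its per-axis uncertainty could drive the measurement update of an Extended Kalman Filter, with the measurement noise covariance R built from the predicted standard deviations. This is a minimal illustrative example under assumed conventions, not the authors' implementation: the state layout (velocity plus accelerometer bias), the noise values, and all names are assumptions, and gravity compensation and the full IMU noise/bias refinement described in the abstract are omitted.

```python
# Minimal sketch of uncertainty-driven radar-inertial fusion (illustrative only).
# State x = [v_x, v_y, v_z, b_ax, b_ay, b_az]: world-frame velocity plus
# accelerometer bias. Values and names are assumptions, not the paper's code.
import numpy as np

class VelocityEKF:
    def __init__(self):
        self.x = np.zeros(6)                          # state estimate
        self.P = np.eye(6) * 0.1                      # state covariance (assumed)
        self.Q = np.diag([1e-3] * 3 + [1e-6] * 3)     # process noise (assumed)

    def predict(self, accel_world, dt):
        """Propagate velocity with bias-corrected IMU acceleration."""
        F = np.eye(6)
        F[0:3, 3:6] = -dt * np.eye(3)                 # velocity sensitivity to bias
        self.x[0:3] += dt * (np.asarray(accel_world) - self.x[3:6])
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_with_radar(self, v_meas, sigma_pred):
        """Fuse the radar network's velocity estimate; R comes directly from
        the network-predicted per-axis standard deviations."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])  # we observe velocity only
        R = np.diag(np.asarray(sigma_pred, dtype=float) ** 2)
        y = np.asarray(v_meas, dtype=float) - H @ self.x        # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)                     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```

Between radar frames, IMU samples would drive predict; each radar frame then calls update_with_radar with the network's velocity and predicted standard deviations, so frames with large predicted uncertainty are automatically down-weighted.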
Related papers
- Age of Information Minimization in UAV-Enabled Integrated Sensing and Communication Systems [34.92822911897626]
Unmanned aerial vehicles (UAVs) equipped with integrated sensing and communication (ISAC) capabilities are envisioned to play a pivotal role in future wireless networks. We propose an Age of Information (AoI)-minimizing system that simultaneously performs target sensing and multi-user communication.
arXiv Detail & Related papers (2025-07-18T18:17:09Z) - TacoDepth: Towards Efficient Radar-Camera Depth Estimation with One-stage Fusion [54.46664104437454]
We propose TacoDepth, an efficient and accurate Radar-Camera depth estimation model with one-stage fusion. Specifically, a graph-based Radar structure extractor and a pyramid-based Radar fusion module are designed. Compared with the previous state-of-the-art approach, TacoDepth improves depth accuracy and processing speed by 12.8% and 91.8%, respectively.
arXiv Detail & Related papers (2025-04-16T05:25:04Z) - Continuously Optimizing Radar Placement with Model Predictive Path Integrals [16.148347437965683]
Continuously optimizing sensor placement is essential for precise target localization in various military and civilian applications. We employ a range measurement model that incorporates radar parameters and radar-target distance. We visualize the evolving geometry of radars and targets over time, highlighting areas of highest measurement information gain.
arXiv Detail & Related papers (2024-05-29T11:25:53Z) - Depth Estimation fusing Image and Radar Measurements with Uncertain Directions [14.206589791912458]
In prior radar-image fusion work, image features are merged with the uncertain sparse depths measured by radar through convolutional layers.
Our method avoids this problem by computing features only with an image and conditioning the features pixelwise with the radar depth.
Our method improves the training data by learning only from these possibly correct radar directions, whereas the previous method trains on raw radar measurements.
arXiv Detail & Related papers (2024-03-23T10:16:36Z) - AI-Based Energy Transportation Safety: Pipeline Radial Threat Estimation Using Intelligent Sensing System [52.93806509364342]
This paper proposes a radial threat estimation method for energy pipelines based on distributed optical fiber sensing technology.
We introduce a continuous multi-view and multi-domain feature fusion methodology to extract comprehensive signal features.
We incorporate the concept of transfer learning through a pre-trained model, enhancing both recognition accuracy and training efficiency.
arXiv Detail & Related papers (2023-12-18T12:37:35Z) - Semantic Segmentation of Radar Detections using Convolutions on Point Clouds [59.45414406974091]
We introduce a deep-learning based method that applies convolutions to radar detection point clouds.
We adapt this algorithm to radar-specific properties through distance-dependent clustering and pre-processing of input point clouds.
Our network outperforms state-of-the-art approaches that are based on PointNet++ on the task of semantic segmentation of radar point clouds.
arXiv Detail & Related papers (2023-05-22T07:09:35Z) - R4Dyn: Exploring Radar for Self-Supervised Monocular Depth Estimation of Dynamic Scenes [69.6715406227469]
Self-supervised monocular depth estimation in driving scenarios has achieved comparable performance to supervised approaches.
We present R4Dyn, a novel set of techniques to use cost-efficient radar data on top of a self-supervised depth estimation framework.
arXiv Detail & Related papers (2021-08-10T17:57:03Z) - Efficient and Robust LiDAR-Based End-to-End Navigation [132.52661670308606]
We present an efficient and robust LiDAR-based end-to-end navigation framework.
We propose Fast-LiDARNet that is based on sparse convolution kernel optimization and hardware-aware model design.
We then propose Hybrid Evidential Fusion that directly estimates the uncertainty of the prediction from only a single forward pass (a generic sketch of this single-pass evidential idea appears after this list).
arXiv Detail & Related papers (2021-05-20T17:52:37Z) - Deep Evaluation Metric: Learning to Evaluate Simulated Radar Point Clouds for Virtual Testing of Autonomous Driving [0.0]
The usage of environment sensor models for virtual testing is a promising approach to reduce the testing effort of autonomous driving.
In this work, we train a neural network to distinguish real and simulated radar sensor data.
We propose the classifier's confidence score for the 'real radar point cloud' class as a metric to determine the degree of fidelity of synthetically generated radar data.
arXiv Detail & Related papers (2021-04-14T11:04:50Z) - LiRaNet: End-to-End Trajectory Prediction using Spatio-Temporal Radar Fusion [52.59664614744447]
We present LiRaNet, a novel end-to-end trajectory prediction method which utilizes radar sensor information along with widely used lidar and high definition (HD) maps.
Automotive radar provides rich, complementary information, allowing for longer-range vehicle detection as well as instantaneous velocity measurements.
arXiv Detail & Related papers (2020-10-02T00:13:00Z)
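A theme shared by the main paper above and several related entries (for example, Hybrid Evidential Fusion in the LiDAR navigation paper) is estimating predictive uncertainty from a single forward pass. The sketch below is a generic, hypothetical illustration of deep evidential regression for that purpose; the head architecture, dimensions, and names are assumptions and are not taken from any of the listed papers.

```python
# Generic sketch of deep evidential regression: the head predicts the parameters
# (gamma, nu, alpha, beta) of a Normal-Inverse-Gamma distribution, from which
# aleatoric and epistemic uncertainty follow in closed form. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidentialHead(nn.Module):
    def __init__(self, in_dim=128, out_dim=3):       # e.g. a 3D velocity output
        super().__init__()
        self.fc = nn.Linear(in_dim, 4 * out_dim)      # four NIG parameters per output

    def forward(self, features):
        gamma, nu_raw, alpha_raw, beta_raw = self.fc(features).chunk(4, dim=-1)
        nu = F.softplus(nu_raw)                       # nu > 0
        alpha = F.softplus(alpha_raw) + 1.0           # alpha > 1
        beta = F.softplus(beta_raw)                   # beta > 0
        aleatoric = beta / (alpha - 1.0)              # expected data noise
        epistemic = beta / (nu * (alpha - 1.0))       # model (epistemic) uncertainty
        return gamma, aleatoric, epistemic            # prediction + two variances
```

A single forward pass thus yields both a point estimate and separate aleatoric/epistemic variances without sampling or ensembles, which is what makes such heads attractive for feeding a downstream filter.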
This list is automatically generated from the titles and abstracts of the papers in this site.