Planar Velocity Estimation for Fast-Moving Mobile Robots Using Event-Based Optical Flow
- URL: http://arxiv.org/abs/2505.11116v1
- Date: Fri, 16 May 2025 11:00:33 GMT
- Title: Planar Velocity Estimation for Fast-Moving Mobile Robots Using Event-Based Optical Flow
- Authors: Liam Boyle, Jonas Kühne, Nicolas Baumann, Niklas Bastuck, Michele Magno
- Abstract summary: We introduce an approach to velocity estimation that is decoupled from wheel-to-surface traction assumptions. The proposed method is evaluated through in-field experiments on a 1:10 scale autonomous racing platform.
- Score: 1.4447019135112429
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate velocity estimation is critical in mobile robotics, particularly for driver assistance systems and autonomous driving. Wheel odometry fused with Inertial Measurement Unit (IMU) data is a widely used method for velocity estimation; however, it typically requires strong assumptions, such as non-slip steering, or complex vehicle dynamics models that do not hold under varying environmental conditions like slippery surfaces. We introduce an approach to velocity estimation that is decoupled from wheel-to-surface traction assumptions by leveraging planar kinematics in combination with optical flow from event cameras pointed perpendicularly at the ground. The asynchronous, microsecond latency and high dynamic range of event cameras make them highly robust to motion blur, a common challenge in vision-based perception techniques for autonomous driving. The proposed method is evaluated through in-field experiments on a 1:10 scale autonomous racing platform and compared to precise motion capture data, demonstrating not only performance on par with the state-of-the-art Event-VIO method but also a 38.3% improvement in lateral error. Qualitative experiments at highway speeds of up to 32 m/s further confirm the effectiveness of our approach, indicating significant potential for real-world deployment.
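The abstract gives the geometric idea but no code. As a rough illustration of how ground-plane optical flow maps to planar velocity in the stated setup (a camera pointed perpendicularly at the ground), the sketch below assumes a pinhole model with focal length `f_px` in pixels and a known, fixed camera height, and removes yaw-induced flow using a gyroscope reading. All names, and the robust median averaging, are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def planar_velocity(flow_uv, pix, f_px, height_m, yaw_rate=0.0):
    """Estimate body-frame planar velocity from ground-plane optical flow.

    flow_uv : (N, 2) optical flow vectors in pixels/second
    pix     : (N, 2) pixel coordinates relative to the principal point
    f_px    : focal length in pixels
    height_m: camera height above the ground plane in meters
    yaw_rate: rad/s about the optical axis (e.g. from a gyroscope),
              used to remove the rotation-induced flow component
    """
    # Flow induced by yaw rotation: omega x r, in pixels/second.
    rot_flow = yaw_rate * np.stack([-pix[:, 1], pix[:, 0]], axis=1)
    trans_flow = flow_uv - rot_flow
    # Pinhole model over a plane at depth height_m: u = f * X / Z,
    # so du/dt = -f * vx / Z for pure translation over static ground.
    v_pix = np.median(trans_flow, axis=0)   # robust average over the patch
    return -v_pix * height_m / f_px         # meters/second, body frame
```

With the camera rigid to the chassis, the returned vector is a body-frame planar velocity that never touches wheel-slip assumptions, which is the decoupling the abstract describes.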
Related papers
- NOVA: Navigation via Object-Centric Visual Autonomy for High-Speed Target Tracking in Unstructured GPS-Denied Environments [56.35569661650558]
We introduce NOVA, a fully onboard, object-centric framework that enables robust target tracking and collision-aware navigation. Rather than constructing a global map, NOVA formulates perception, estimation, and control entirely in the target's reference frame. We validate NOVA across challenging real-world scenarios, including urban mazes, forest trails, and repeated transitions through buildings with intermittent GPS loss.
arXiv Detail & Related papers (2025-06-23T14:28:30Z)
- Radar and Event Camera Fusion for Agile Robot Ego-Motion Estimation [27.282729603784496]
We propose an IMU-free and feature-association-free framework to achieve aggressive ego-motion velocity estimation of a robot platform. We use instantaneous raw events and Doppler measurements to derive rotational and translational velocities directly. In the back-end, we propose a continuous-time state-space model to fuse the hybrid time-based and event-based measurements to estimate the ego-motion velocity in a fixed-lag smoother fashion.
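The summary names Doppler-derived translational velocity but not the estimator. A standard single-scan formulation (an assumption here, not necessarily this paper's exact front end) recovers ego velocity from static detections, since each static target's measured radial speed is the negative projection of the sensor velocity onto its bearing:

```python
import numpy as np

def ego_velocity_from_doppler(bearings, dopplers):
    """Least-squares ego-velocity from a single radar scan.

    bearings: (N, 3) unit vectors from the radar to each detection
    dopplers: (N,)   measured radial (Doppler) speeds in m/s

    For a static target, d_i = -b_i . v, so stacking all detections
    gives a linear system solved in the least-squares sense (in
    practice RANSAC would be added to reject moving targets).
    """
    A = -bearings                                   # (N, 3)
    v, *_ = np.linalg.lstsq(A, dopplers, rcond=None)
    return v                                        # (3,) m/s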
arXiv Detail & Related papers (2025-06-23T09:27:22Z)
- Distance Estimation in Outdoor Driving Environments Using Phase-only Correlation Method with Event Cameras [5.690128924544198]
We present a method for distance estimation using a monocular event camera and a roadside LED bar. The proposed approach achieves over 90% success rate with less than 0.5-meter error for distances ranging from 20 to 60 meters. Future work includes extending this method to full position estimation by leveraging infrastructure such as smart poles equipped with LEDs.
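Phase-only correlation (POC) itself is a classical technique; the sketch below shows the generic 2-D shift estimator, while the paper's full pipeline (event frames of a roadside LED bar mapped to distance) is more involved and not reproduced here.

```python
import numpy as np

def phase_only_correlation(img_a, img_b):
    """Estimate translational shift between two images via POC.

    Only the phase of the cross-power spectrum is kept, which makes
    the correlation peak sharp and robust to illumination changes.
    Returns the (dy, dx) shift that maps img_a onto img_b.
    """
    Fa = np.fft.fft2(img_a)
    Fb = np.fft.fft2(img_b)
    cross = np.conj(Fa) * Fb
    cross /= np.abs(cross) + 1e-12          # keep phase only
    poc = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(poc), poc.shape)
    # Indices above N/2 wrap around to negative shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, poc.shape))
```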
arXiv Detail & Related papers (2025-05-23T07:44:33Z)
- Event-Based Tracking Any Point with Motion-Augmented Temporal Consistency [58.719310295870024]
This paper presents an event-based framework for tracking any point. It tackles the challenges posed by spatial sparsity and motion sensitivity in events. It achieves 150% faster processing with competitive model parameters.
arXiv Detail & Related papers (2024-12-02T09:13:29Z)
- Event-Aided Time-to-Collision Estimation for Autonomous Driving [28.13397992839372]
We present a novel method that estimates the time to collision using a neuromorphic event-based camera.
The proposed algorithm consists of a two-step approach for efficient and accurate geometric model fitting on event data.
Experiments on both synthetic and real data demonstrate the effectiveness of the proposed method.
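The abstract does not state the geometric model being fitted; one classical relation (an assumption here, not necessarily the paper's) recovers time to collision from the divergence of the flow field, since a pure approach toward a fronto-parallel surface yields a radially expanding flow (u, v)/tau whose divergence is 2/tau. A minimal sketch:

```python
import numpy as np

def ttc_from_flow(flow_u, flow_v, dx=1.0):
    """Time-to-collision from optical-flow divergence.

    flow_u, flow_v : 2-D arrays of flow components (pixels/second)
    dx             : pixel pitch used in the finite differences
    """
    du_dx = np.gradient(flow_u, dx, axis=1)
    dv_dy = np.gradient(flow_v, dx, axis=0)
    divergence = np.mean(du_dx + dv_dy)   # averaging tames flow noise
    return 2.0 / divergence               # seconds until contact
```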
arXiv Detail & Related papers (2024-07-10T02:37:36Z)
- Tight Fusion of Events and Inertial Measurements for Direct Velocity Estimation [20.002238735553792]
We propose a novel solution to tight visual-inertial fusion directly at the level of first-order kinematics by employing a dynamic vision sensor instead of a normal camera.
We demonstrate how velocity estimates in highly dynamic situations can be obtained over short time intervals.
Experiments on both simulated and real data demonstrate that the proposed tight event-inertial fusion leads to continuous and reliable velocity estimation.
arXiv Detail & Related papers (2024-01-17T15:56:57Z)
- Learning Terrain-Aware Kinodynamic Model for Autonomous Off-Road Rally Driving With Model Predictive Path Integral Control [4.23755398158039]
We propose a method for learning a terrain-aware kinodynamic model conditioned on both proprioceptive and exteroceptive information.
The proposed model generates reliable predictions of 6-degree-of-freedom motion and can even estimate contact interactions.
We demonstrate the effectiveness of our approach through experiments on a simulated off-road track, showing that our proposed model-controller pair outperforms the baseline.
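For readers unfamiliar with the Model Predictive Path Integral (MPPI) controller named in the title, below is a generic, minimal MPPI step. The `dynamics` and `cost` callables are placeholders standing in for the paper's learned kinodynamic model and track cost; this is an illustrative sketch, not the authors' controller.

```python
import numpy as np

def mppi_step(state, dynamics, cost, u_init,
              n_samples=256, sigma=0.5, lam=1.0):
    """One Model Predictive Path Integral (MPPI) update.

    dynamics(x, u) -> next state  (e.g. a learned kinodynamic model)
    cost(x, u)     -> scalar stage cost
    u_init         -> (horizon, u_dim) warm-started control sequence
    Returns the updated sequence; apply its first action, then shift
    and reuse the rest at the next control step.
    """
    rng = np.random.default_rng()
    noise = rng.normal(0.0, sigma, size=(n_samples, *u_init.shape))
    costs = np.zeros(n_samples)
    for k in range(n_samples):              # roll out each perturbed plan
        x = state
        for t in range(u_init.shape[0]):
            u = u_init[t] + noise[k, t]
            x = dynamics(x, u)
            costs[k] += cost(x, u)
    # Information-theoretic weighting: low-cost rollouts dominate.
    w = np.exp(-(costs - costs.min()) / lam)
    w /= w.sum()
    return u_init + np.einsum("k,ktu->tu", w, noise)
```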
arXiv Detail & Related papers (2023-05-01T06:09:49Z)
- Motion Planning and Control for Multi Vehicle Autonomous Racing at High Speeds [100.61456258283245]
This paper presents a multi-layer motion planning and control architecture for autonomous racing.
The proposed solution has been applied on a Dallara AV-21 racecar and tested at oval race tracks, achieving lateral accelerations up to 25 m/s².
arXiv Detail & Related papers (2022-07-22T15:16:54Z)
- StreamYOLO: Real-time Object Detection for Streaming Perception [84.2559631820007]
We endow the models with the capacity of predicting the future, significantly improving the results for streaming perception.
We consider driving scenes with multiple vehicle velocities and propose Velocity-aware streaming AP (VsAP) to jointly evaluate accuracy.
Our simple method achieves the state-of-the-art performance on Argoverse-HD dataset and improves the sAP and VsAP by 4.7% and 8.2% respectively.
arXiv Detail & Related papers (2022-07-21T12:03:02Z)
- Real Time Monocular Vehicle Velocity Estimation using Synthetic Data [78.85123603488664]
We look at the problem of estimating the velocity of road vehicles from a camera mounted on a moving car.
We propose a two-step approach where first an off-the-shelf tracker is used to extract vehicle bounding boxes and then a small neural network is used to regress the vehicle velocity.
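The two-step pipeline is described only at a high level. A toy sketch of the second step, assuming a hypothetical feature layout (box center, box height as a depth proxy, and timestamps over T tracked frames) feeding a small PyTorch MLP that regresses planar velocity; the real network and features may differ:

```python
import torch
import torch.nn as nn

# Hypothetical feature layout: for each of T = 8 tracked frames keep the
# box center (cx, cy), box height h (a depth proxy), and the frame time,
# giving a flat 8 * 4 = 32-dimensional input per vehicle track.
T, FEAT = 8, 4

velocity_head = nn.Sequential(
    nn.Linear(T * FEAT, 64),
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.ReLU(),
    nn.Linear(32, 2),            # (longitudinal, lateral) velocity, m/s
)

track = torch.randn(1, T * FEAT)  # placeholder for real tracker output
v_pred = velocity_head(track)     # trained with a regression loss on
                                  # synthetic ground-truth velocities
```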
arXiv Detail & Related papers (2021-09-16T13:10:27Z)
- Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow, and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
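LISA's physics model is considerably richer; as a much-simplified illustration (not the LISA model itself), the sketch below applies Beer-Lambert two-way attenuation to lidar intensities and drops returns below a hypothetical detection threshold, mimicking the SNR loss described above.

```python
import numpy as np

def attenuate_lidar(points, intensity, alpha=0.004, i_min=0.01):
    """Toy rain augmentation for a lidar scan (not LISA's full model).

    points    : (N, 3) xyz returns in meters
    intensity : (N,)   received intensities
    alpha     : extinction coefficient in 1/m (grows with rain rate)
    i_min     : receiver detection threshold
    """
    r = np.linalg.norm(points, axis=1)
    # Beer-Lambert two-way attenuation over range r.
    i_att = intensity * np.exp(-2.0 * alpha * r)
    keep = i_att > i_min                  # sub-threshold returns vanish
    return points[keep], i_att[keep]
```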
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
- End-to-end Learning for Inter-Vehicle Distance and Relative Velocity Estimation in ADAS with a Monocular Camera [81.66569124029313]
We propose a camera-based inter-vehicle distance and relative velocity estimation method based on end-to-end training of a deep neural network.
The key novelty of our method is the integration of multiple visual cues provided by any two time-consecutive monocular frames.
We also propose a vehicle-centric sampling mechanism to alleviate the effect of perspective distortion in the motion field.
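The exact sampling mechanism is not detailed in the summary; a plausible minimal reading, sketched below under that assumption, crops each detected vehicle and resamples it to a fixed canonical resolution so the network sees scale-normalized motion regardless of distance.

```python
import numpy as np

def vehicle_centric_crop(frame, box, out_hw=(64, 64)):
    """Resample a vehicle's bounding box to a canonical resolution.

    Rescaling every detection to a fixed size gives the network a
    scale-normalized view, reducing the perspective distortion that
    near vs. far vehicles would otherwise induce in the motion field.

    frame : (H, W, C) image
    box   : (x1, y1, x2, y2) pixel coordinates
    """
    x1, y1, x2, y2 = (int(v) for v in box)
    crop = frame[y1:y2, x1:x2]
    h, w = crop.shape[:2]
    ys = np.linspace(0, h - 1, out_hw[0]).astype(int)  # nearest neighbor
    xs = np.linspace(0, w - 1, out_hw[1]).astype(int)
    return crop[np.ix_(ys, xs)]
```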
arXiv Detail & Related papers (2020-06-07T08:18:31Z)