Globally Optimal Event-Based Divergence Estimation for Ventral Landing
- URL: http://arxiv.org/abs/2209.13168v1
- Date: Tue, 27 Sep 2022 06:00:52 GMT
- Title: Globally Optimal Event-Based Divergence Estimation for Ventral Landing
- Authors: Sofia McLeod, Gabriele Meoni, Dario Izzo, Anne Mergy, Daqi Liu, Yasir
Latif, Ian Reid, Tat-Jun Chin
- Abstract summary: Event sensing is a major component in bio-inspired flight guidance and control systems.
We explore the use of event cameras for predicting time-to-contact with the surface during ventral landing.
This is achieved by estimating divergence (inverse TTC), which is the rate of radial optic flow, from the event stream generated during landing.
Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value.
- Score: 55.29096494880328
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event sensing is a major component in bio-inspired flight guidance and
control systems. We explore the use of event cameras for predicting
time-to-contact (TTC) with the surface during ventral landing. This is achieved
by estimating divergence (inverse TTC), which is the rate of radial optic flow,
from the event stream generated during landing. Our core contributions are a
novel contrast maximisation formulation for event-based divergence estimation,
and a branch-and-bound algorithm to exactly maximise contrast and find the
optimal divergence value. GPU acceleration is used to speed up the global
algorithm. Another contribution is a new dataset containing real event streams
from ventral landing that was employed to test and benchmark our method. Owing
to global optimisation, our algorithm is much more capable of recovering the
true divergence than other heuristic divergence estimators or event-based
optic flow methods. With GPU acceleration, our method also achieves
competitive runtimes.
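As a rough illustration of the contrast-maximisation idea described in the abstract, the sketch below warps events radially to a reference time according to a candidate divergence value, accumulates them into an image of warped events, and scores the candidate by the variance of that image. The linearised radial warp, the variance objective, the coarse grid search (standing in for the paper's branch-and-bound optimiser), and all function names and parameters are illustrative assumptions, not the paper's exact formulation.

# Hedged sketch (not the paper's implementation): contrast maximisation for
# divergence estimation using a linearised radial warp and a grid search.
import numpy as np

def warp_events(xy, t, divergence, centre, t_ref=0.0):
    # Assumed warp model: radial contraction/expansion about the image centre,
    # x' = c + (x - c) * (1 - D * (t - t_ref)); the paper's exact warp may differ.
    scale = 1.0 - divergence * (t - t_ref)
    return centre + (xy - centre) * scale[:, None]

def contrast(xy_warped, img_shape):
    # Accumulate warped events into an image of warped events (IWE) and
    # score the candidate motion by the image variance.
    h, w = img_shape
    xi = np.clip(np.round(xy_warped[:, 0]).astype(int), 0, w - 1)
    yi = np.clip(np.round(xy_warped[:, 1]).astype(int), 0, h - 1)
    iwe = np.zeros((h, w))
    np.add.at(iwe, (yi, xi), 1.0)
    return iwe.var()

def estimate_divergence(xy, t, img_shape, d_range=(-5.0, 5.0), steps=201):
    # Coarse grid search over the scalar divergence (1/s); a stand-in for the
    # exact branch-and-bound search described in the abstract.
    centre = np.array([img_shape[1] / 2.0, img_shape[0] / 2.0])
    candidates = np.linspace(d_range[0], d_range[1], steps)
    scores = [contrast(warp_events(xy, t, d, centre), img_shape) for d in candidates]
    return candidates[int(np.argmax(scores))]

# Toy usage with synthetic events (pixel coordinates, timestamps in seconds).
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 240.0, size=(5000, 2))
t = rng.uniform(0.0, 0.05, size=5000)
print(estimate_divergence(xy, t, img_shape=(240, 240)))

The paper replaces this grid search with a branch-and-bound algorithm that exactly maximises contrast, accelerated on a GPU.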
Related papers
- AsynEIO: Asynchronous Monocular Event-Inertial Odometry Using Gaussian Process Regression [7.892365588256595]
We introduce a monocular event-inertial odometry method called AsynEIO, designed to fuse asynchronous event and inertial data.
We show that AsynEIO outperforms existing methods, especially in high-speed and low-illumination scenarios.
arXiv Detail & Related papers (2024-11-19T02:39:57Z) - EROAM: Event-based Camera Rotational Odometry and Mapping in Real-time [14.989905816510698]
EROAM is a novel event-based rotational odometry and mapping system that achieves real-time, accurate camera rotation estimation.
We show that EROAM significantly outperforms state-of-the-art methods in terms of accuracy, robustness, and computational efficiency.
arXiv Detail & Related papers (2024-11-17T08:50:47Z) - Real-Time Polygonal Semantic Mapping for Humanoid Robot Stair Climbing [19.786955745157453]
We present a novel algorithm for real-time planar semantic mapping tailored for humanoid robots navigating complex terrains such as staircases.
We utilize an anisotropic diffusion filter on depth images to effectively minimize noise from gradient jumps while preserving essential edge details.
Our approach achieves real-time performance, processing single frames at rates exceeding 30 Hz, which enables fast and efficient plane extraction and map management.
arXiv Detail & Related papers (2024-11-04T09:34:55Z) - Adaptive Federated Learning Over the Air [108.62635460744109]
We propose a federated version of adaptive gradient methods, particularly AdaGrad and Adam, within the framework of over-the-air model training.
Our analysis shows that the AdaGrad-based training algorithm converges to a stationary point at the rate of $\mathcal{O}(\ln(T) / T^{1 - \frac{1}{\alpha}})$.
arXiv Detail & Related papers (2024-03-11T09:10:37Z) - Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - FAITH: Fast iterative half-plane focus of expansion estimation using
event-based optic flow [3.326320568999945]
This study proposes the FAst ITerative Half-plane (FAITH) method to determine the course of a micro air vehicle (MAV).
Results show that the computational efficiency of our solution outperforms state-of-the-art methods while keeping a high level of accuracy.
arXiv Detail & Related papers (2021-02-25T12:49:02Z) - Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras output brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Learning-based approaches have recently been applied to event-based data for tasks such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z) - End-to-end Learning for Inter-Vehicle Distance and Relative Velocity
Estimation in ADAS with a Monocular Camera [81.66569124029313]
We propose a camera-based inter-vehicle distance and relative velocity estimation method based on end-to-end training of a deep neural network.
The key novelty of our method is the integration of multiple visual clues provided by any two time-consecutive monocular frames.
We also propose a vehicle-centric sampling mechanism to alleviate the effect of perspective distortion in the motion field.
arXiv Detail & Related papers (2020-06-07T08:18:31Z) - Globally Optimal Contrast Maximisation for Event-based Motion Estimation [43.048406187129736]
We propose a new globally optimal event-based motion estimation algorithm.
Based on branch-and-bound (BnB), our method solves rotational (3DoF) motion estimation on event streams (see the generic search sketch after this list).
Our algorithm is currently able to process a 50,000-event input in 300 seconds.
arXiv Detail & Related papers (2020-02-25T05:54:29Z)