Globally Optimal Event-Based Divergence Estimation for Ventral Landing
- URL: http://arxiv.org/abs/2209.13168v1
- Date: Tue, 27 Sep 2022 06:00:52 GMT
- Title: Globally Optimal Event-Based Divergence Estimation for Ventral Landing
- Authors: Sofia McLeod, Gabriele Meoni, Dario Izzo, Anne Mergy, Daqi Liu, Yasir
Latif, Ian Reid, Tat-Jun Chin
- Abstract summary: Event sensing is a major component in bio-inspired flight guidance and control systems.
We explore the usage of event cameras for predicting time-to-contact with the surface during ventral landing.
This is achieved by estimating divergence (inverse TTC), which is the rate of radial optic flow, from the event stream generated during landing.
Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value.
- Score: 55.29096494880328
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event sensing is a major component in bio-inspired flight guidance and
control systems. We explore the usage of event cameras for predicting
time-to-contact (TTC) with the surface during ventral landing. This is achieved
by estimating divergence (inverse TTC), which is the rate of radial optic flow,
from the event stream generated during landing. Our core contributions are a
novel contrast maximisation formulation for event-based divergence estimation,
and a branch-and-bound algorithm to exactly maximise contrast and find the
optimal divergence value. GPU acceleration is conducted to speed up the global
algorithm. Another contribution is a new dataset containing real event streams
from ventral landing that was employed to test and benchmark our method. Owing
to global optimisation, our algorithm is much more capable at recovering the
true divergence, compared to other heuristic divergence estimators or
event-based optic flow methods. With GPU acceleration, our method also achieves
competitive runtimes.
Related papers
- Adaptive Federated Learning Over the Air [108.62635460744109]
We propose a federated version of adaptive gradient methods, particularly AdaGrad and Adam, within the framework of over-the-air model training.
Our analysis shows that the AdaGrad-based training algorithm converges to a stationary point at the rate of $\mathcal{O}\big(\ln(T) / T^{1 - \frac{1}{\alpha}}\big)$.
arXiv Detail & Related papers (2024-03-11T09:10:37Z)
- Streaming Factor Trajectory Learning for Temporal Tensor Decomposition [33.18423605559094]
We propose Streaming Factor Trajectory Learning for temporal tensor decomposition.
We use Gaussian processes (GPs) to model the trajectory of factors so as to flexibly estimate their temporal evolution.
We have shown the advantage of SFTL in both synthetic tasks and real-world applications.
arXiv Detail & Related papers (2023-10-25T21:58:52Z)
- Fast Event-based Optical Flow Estimation by Triplet Matching [13.298845944779108]
Event cameras offer advantages over traditional cameras (low latency, high dynamic range, low power, etc.).
Optical flow estimation methods that work on packets of events trade off speed for accuracy.
We propose a novel optical flow estimation scheme based on triplet matching.
arXiv Detail & Related papers (2022-12-23T09:12:16Z)
- Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- hARMS: A Hardware Acceleration Architecture for Real-Time Event-Based Optical Flow [0.0]
Event-based vision sensors produce asynchronous event streams with high temporal resolution based on changes in the visual scene.
Existing solutions for calculating optical flow from event data fail to capture the true direction of motion due to the aperture problem.
We present a hardware realization of the fARMS algorithm allowing for real-time computation of true flow on low-power, embedded platforms.
arXiv Detail & Related papers (2021-12-13T16:27:17Z)
- FAITH: Fast iterative half-plane focus of expansion estimation using event-based optic flow [3.326320568999945]
This study proposes the FAst ITerative Half-plane (FAITH) method to determine the course of a micro air vehicle (MAV).
Results show that the computational efficiency of our solution outperforms state-of-the-art methods while keeping a high level of accuracy.
arXiv Detail & Related papers (2021-02-25T12:49:02Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- End-to-end Learning for Inter-Vehicle Distance and Relative Velocity Estimation in ADAS with a Monocular Camera [81.66569124029313]
We propose a camera-based inter-vehicle distance and relative velocity estimation method based on end-to-end training of a deep neural network.
The key novelty of our method is the integration of multiple visual clues provided by any two time-consecutive monocular frames.
We also propose a vehicle-centric sampling mechanism to alleviate the effect of perspective distortion in the motion field.
arXiv Detail & Related papers (2020-06-07T08:18:31Z)
- Globally Optimal Contrast Maximisation for Event-based Motion Estimation [43.048406187129736]
We propose a new globally optimal event-based motion estimation algorithm.
Based on branch-and-bound (BnB), our method solves rotational (3DoF) motion estimation on event streams.
Our algorithm is currently able to process a 50,000-event input in 300 seconds.
arXiv Detail & Related papers (2020-02-25T05:54:29Z)
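Both the entry above and the main paper maximise contrast exactly via branch-and-bound (BnB). A generic 1-D BnB maximiser with a user-supplied interval upper bound illustrates the prune-and-branch pattern; the heap ordering, midpoint branching, and `upper_bound` interface here are illustrative assumptions, not the papers' actual bounding functions.

```python
import heapq

def bnb_maximise(f, upper_bound, lo, hi, tol=1e-3):
    """Generic 1-D branch-and-bound maximisation.

    `f(x)` scores a point; `upper_bound(l, r)` must bound max f on
    [l, r] from above. Deriving a tight, valid bound is the
    problem-specific part and is assumed given here.
    """
    best_x, best_f = lo, f(lo)
    # Max-heap on interval upper bounds (negated for heapq's min-heap).
    heap = [(-upper_bound(lo, hi), lo, hi)]
    while heap:
        neg_ub, a, b = heapq.heappop(heap)
        if -neg_ub <= best_f:       # prune: cannot beat the incumbent
            continue
        m = 0.5 * (a + b)
        fm = f(m)                   # evaluate at the branching point
        if fm > best_f:
            best_x, best_f = m, fm
        if b - a > tol:             # branch: split into two halves
            for l, r in ((a, m), (m, b)):
                ub = upper_bound(l, r)
                if ub > best_f:
                    heapq.heappush(heap, (-ub, l, r))
    return best_x, best_f
```

The search stays exact because an interval is discarded only when its upper bound proves it cannot contain anything better than the incumbent; the papers' technical contribution is a tight upper bound on contrast over an interval of motion (or divergence) parameters.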
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.