FLIVVER: Fly Lobula Inspired Visual Velocity Estimation & Ranging
- URL: http://arxiv.org/abs/2004.05247v1
- Date: Fri, 10 Apr 2020 22:35:13 GMT
- Title: FLIVVER: Fly Lobula Inspired Visual Velocity Estimation & Ranging
- Authors: Bryson Lingenfelter, Arunava Nag, and Floris van Breugel
- Abstract summary: How a tiny insect or insect-sized robot could estimate its absolute velocity and distance to nearby objects remains unknown.
We present a novel algorithm, FLIVVER, which combines the geometry of dynamic forward motion with inspiration from insect visual processing.
Our algorithm provides a clear hypothesis for how insects might estimate absolute velocity, and also provides a theoretical framework for designing fast analog circuitry for efficient state estimation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The mechanism by which a tiny insect or insect-sized robot could estimate its
absolute velocity and distance to nearby objects remains unknown. However, this
ability is critical for behaviors that require estimating wind direction during
flight, such as odor-plume tracking. Neuroscience and behavior studies with
insects have shown that they rely on the perception of image motion, or optic
flow, to estimate relative motion, equivalent to a ratio of their velocity and
distance to objects in the world. The key open challenge is therefore to
decouple these two states from a single measurement of their ratio. Although
modern SLAM (Simultaneous Localization and Mapping) methods provide a solution
to this problem for robotic systems, these methods typically rely on
computations that insects likely cannot perform, such as simultaneously
tracking multiple individual visual features, remembering a 3D map of the
world, and solving nonlinear optimization problems using iterative algorithms.
Here we present a novel algorithm, FLIVVER, which combines the geometry of
dynamic forward motion with inspiration from insect visual processing to
directly estimate absolute ground velocity from a combination of optic
flow and acceleration information. Our algorithm provides a clear hypothesis
for how insects might estimate absolute velocity, and also provides a
theoretical framework for designing fast analog circuitry for efficient state
estimation, which could be applied to insect-sized robots.
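The core geometric idea behind decoupling velocity from distance can be sketched in a simplified form. This is not the paper's algorithm, only an illustration of the underlying identity under strong assumptions: pure forward motion toward a surface, so the distance shrinks at the vehicle's speed (d' = -v). Then optic flow omega = v/d satisfies omega' = a/d + omega^2, and with a known acceleration a the absolute velocity follows as v = a*omega / (omega' - omega^2). All function and variable names below are hypothetical.

```python
def estimate_velocity(omega, omega_dot, accel):
    """Estimate absolute forward velocity from optic flow and acceleration.

    Simplified illustration (not the FLIVVER algorithm itself), assuming
    pure forward motion toward a surface so that d' = -v. From
    omega = v/d it follows that omega' = a/d + omega**2, which can be
    solved for v = a*omega / (omega' - omega**2).
    """
    denom = omega_dot - omega ** 2
    if abs(denom) < 1e-9:
        # With zero acceleration, optic flow alone only constrains
        # the ratio v/d; the two states cannot be decoupled.
        raise ValueError("acceleration too small to decouple velocity from distance")
    return accel * omega / denom


# Synthetic check: v = 2 m/s, d = 4 m, a = 1 m/s^2
v_true, d, a = 2.0, 4.0, 1.0
omega = v_true / d                  # 0.5 rad/s
omega_dot = a / d + omega ** 2      # 0.5 rad/s^2
print(estimate_velocity(omega, omega_dot, a))  # → 2.0
```

Note how the denominator vanishes at constant velocity: this reflects the abstract's point that optic flow alone fixes only the ratio of velocity to distance, and acceleration supplies the extra information needed to separate them.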
Related papers
- Ultrafast vision perception by neuromorphic optical flow [1.1980928503177917]
3D neuromorphic optical flow method embeds external motion features directly into hardware.
In our demonstration, this approach reduces visual data processing time by an average of 0.3 seconds.
The neuromorphic optical flow algorithm's flexibility allows seamless integration with existing algorithms.
arXiv Detail & Related papers (2024-09-10T10:59:32Z)
- Neural Implicit Swept Volume Models for Fast Collision Detection [0.0]
We present an algorithm combining the speed of deep-learning-based signed distance computations with the strong accuracy guarantees of geometric collision checkers.
We validate our approach in simulated and real-world robotic experiments, and demonstrate that it is able to speed up a commercial bin picking application.
arXiv Detail & Related papers (2024-02-23T12:06:48Z)
- Correlating sparse sensing for large-scale traffic speed estimation: A Laplacian-enhanced low-rank tensor kriging approach [76.45949280328838]
We propose a Laplacian-enhanced low-rank tensor (LETC) framework featuring both low-rankness and multi-temporal correlations for large-scale traffic speed kriging.
We then design an efficient solution algorithm via several effective numeric techniques to scale up the proposed model to network-wide kriging.
arXiv Detail & Related papers (2022-10-21T07:25:57Z)
- The Right Spin: Learning Object Motion from Rotation-Compensated Flow Fields [61.664963331203666]
How humans perceive moving objects is a longstanding research question in computer vision.
One approach to the problem is to teach a deep network to model all of these effects.
We present a novel probabilistic model to estimate the camera's rotation given the motion field.
arXiv Detail & Related papers (2022-02-28T22:05:09Z)
- FMODetect: Robust Detection and Trajectory Estimation of Fast Moving Objects [110.29738581961955]
We propose the first learning-based approach for detection and trajectory estimation of fast moving objects.
The proposed method first detects all fast moving objects as a truncated distance function to the trajectory.
For the sharp appearance estimation, we propose an energy minimization based deblurring.
arXiv Detail & Related papers (2020-12-15T11:05:34Z)
- Fast Motion Understanding with Spatiotemporal Neural Networks and Dynamic Vision Sensors [99.94079901071163]
This paper presents a Dynamic Vision Sensor (DVS) based system for reasoning about high speed motion.
We consider the case of a robot at rest reacting to a small, fast-approaching object moving faster than 15 m/s.
We demonstrate our system on a toy dart moving at 23.4 m/s, achieving a 24.73° error in $\theta$, an 18.4 mm average discretized-radius prediction error, and a 25.03% median time-to-collision prediction error.
arXiv Detail & Related papers (2020-11-18T17:55:07Z)
- Map-Based Temporally Consistent Geolocalization through Learning Motion Trajectories [0.5076419064097732]
We propose a novel trajectory learning method that exploits motion trajectories on topological map using recurrent neural network.
Inspired by humans' awareness of both the distance and the direction of self-motion during navigation, our method learns a pattern representation of trajectories, encoded as sequences of distances and turning angles, to assist self-localization.
arXiv Detail & Related papers (2020-10-13T02:08:45Z)
- Appearance-free Tripartite Matching for Multiple Object Tracking [6.165592821539306]
Multiple Object Tracking (MOT) detects the trajectories of multiple objects given an input video.
Most existing algorithms depend on the uniqueness of each object's appearance, and the dominant bipartite matching scheme ignores speed smoothness.
We propose an appearance-free tripartite matching to avoid the irregular velocity problem of the bipartite matching.
arXiv Detail & Related papers (2020-08-09T02:16:44Z)
- Wave Propagation of Visual Stimuli in Focus of Attention [77.4747032928547]
Fast reactions to changes in the surrounding visual environment require efficient attention mechanisms to reallocate computational resources to most relevant locations in the visual field.
We present a biologically plausible model of focus of attention that exhibits the effectiveness and efficiency of foveated animals.
arXiv Detail & Related papers (2020-06-19T09:33:21Z)
- End-to-end Learning for Inter-Vehicle Distance and Relative Velocity Estimation in ADAS with a Monocular Camera [81.66569124029313]
We propose a camera-based inter-vehicle distance and relative velocity estimation method based on end-to-end training of a deep neural network.
The key novelty of our method is the integration of multiple visual clues provided by any two time-consecutive monocular frames.
We also propose a vehicle-centric sampling mechanism to alleviate the effect of perspective distortion in the motion field.
arXiv Detail & Related papers (2020-06-07T08:18:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.