hARMS: A Hardware Acceleration Architecture for Real-Time Event-Based
Optical Flow
- URL: http://arxiv.org/abs/2112.06772v1
- Date: Mon, 13 Dec 2021 16:27:17 GMT
- Title: hARMS: A Hardware Acceleration Architecture for Real-Time Event-Based
Optical Flow
- Authors: Daniel C. Stumpp, Himanshu Akolkar, Alan D. George, Ryad B. Benosman
- Abstract summary: Event-based vision sensors produce asynchronous event streams with high temporal resolution based on changes in the visual scene.
Existing solutions for calculating optical flow from event data often fail to capture the true direction of motion due to the aperture problem.
We present a hardware realization of the fARMS algorithm allowing for real-time computation of true flow on low-power, embedded platforms.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event-based vision sensors produce asynchronous event streams with high
temporal resolution based on changes in the visual scene. The properties of
these sensors allow for accurate and fast calculation of optical flow as events
are generated. Existing solutions for calculating optical flow from event data
either fail to capture the true direction of motion due to the aperture
problem, do not use the high temporal resolution of the sensor, or are too
computationally expensive to be run in real time on embedded platforms. In this
research, we first present a faster version of our previous algorithm, ARMS
(Aperture Robust Multi-Scale flow). The new optimized software version (fARMS)
significantly improves throughput on a traditional CPU. Further, we present
hARMS, a hardware realization of the fARMS algorithm allowing for real-time
computation of true flow on low-power, embedded platforms. The proposed hARMS
architecture targets hybrid system-on-chip devices and was designed to maximize
configurability and throughput. The hardware architecture and fARMS algorithm
were developed with asynchronous neuromorphic processing in mind, abandoning
the common use of an event frame and instead operating using only a small
history of relevant events, allowing latency to scale independently of the
sensor resolution. This change in processing paradigm improved the estimation
of flow directions by up to 73% compared to the existing method and yielded a
demonstrated hARMS throughput of up to 1.21 Mevent/s on the benchmark
configuration selected. This throughput enables real-time performance and makes
it the fastest known realization of aperture-robust, event-based optical flow
to date.
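To make the frame-free paradigm described above concrete, the sketch below keeps only a small, bounded history of recent events and queries a local spatial neighborhood per incoming event, so per-event work depends on the history size rather than the sensor resolution. The class, parameters, and thresholds are hypothetical illustrations, not the fARMS/hARMS implementation.

```python
from collections import deque

class EventHistory:
    """Bounded history of recent events; a stand-in for an event frame."""

    def __init__(self, max_events=64, window_us=10_000):
        self.max_events = max_events  # cap on stored events (assumed value)
        self.window_us = window_us    # max event age in microseconds (assumed)
        self.events = deque()

    def push(self, x, y, t):
        """Add an event and evict anything too old or beyond the cap."""
        self.events.append((x, y, t))
        while len(self.events) > self.max_events:
            self.events.popleft()
        while self.events and t - self.events[0][2] > self.window_us:
            self.events.popleft()

    def neighbors(self, x, y, radius=3):
        """Recent events within a spatial radius of (x, y)."""
        return [(ex, ey, et) for ex, ey, et in self.events
                if abs(ex - x) <= radius and abs(ey - y) <= radius]

# Per-event loop: work scales with the history size, not the pixel count.
history = EventHistory()
for x, y, t in [(10, 12, 100), (11, 12, 160), (12, 12, 220)]:
    history.push(x, y, t)
    local = history.neighbors(x, y)  # input to a local flow estimate
```

Because the history is bounded, per-event latency stays constant as sensor resolution grows, which is the scaling property the abstract highlights.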
Related papers
- SDformerFlow: Spatiotemporal swin spikeformer for event-based optical flow estimation [10.696635172502141]
Event cameras generate asynchronous and sparse event streams capturing changes in light intensity.
Spiking neural networks (SNNs) share similar asynchronous and sparse characteristics and are well-suited for event cameras.
We propose two solutions for fast and robust optical flow estimation for event cameras: STTFlowNet and SDformerFlow.
arXiv Detail & Related papers (2024-09-06T07:48:18Z) - Fast Window-Based Event Denoising with Spatiotemporal Correlation
Enhancement [85.66867277156089]
We propose window-based event denoising, which simultaneously deals with a stack of events.
In the spatial domain, we choose maximum a posteriori (MAP) estimation to discriminate real-world events from noise.
Our algorithm can remove event noise effectively and efficiently and improve the performance of downstream tasks.
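A generic illustration of the MAP decision mentioned above: classify an event as signal or noise from the number of spatiotemporally correlated neighbors in its window. The Poisson likelihoods, rates, and prior below are assumptions for illustration, not the paper's model.

```python
import math

def map_denoise(k, rate_signal=5.0, rate_noise=0.5, prior_signal=0.5):
    """MAP rule: keep the event if P(signal | k) > P(noise | k).

    k: number of correlated neighbor events in the window.
    Poisson likelihoods are assumed; log k! cancels in the comparison.
    """
    def log_lik(k, lam):
        return k * math.log(lam) - lam

    post_signal = log_lik(k, rate_signal) + math.log(prior_signal)
    post_noise = log_lik(k, rate_noise) + math.log(1 - prior_signal)
    return post_signal > post_noise

print(map_denoise(0))  # False: isolated events look like noise
print(map_denoise(4))  # True: well-supported events look real
```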
arXiv Detail & Related papers (2024-02-14T15:56:42Z) - Neuromorphic Optical Flow and Real-time Implementation with Event
Cameras [47.11134388304464]
We build on the latest developments in event-based vision and spiking neural networks.
We propose a new network architecture that improves the state-of-the-art self-supervised optical flow accuracy.
We demonstrate high-speed optical flow prediction with almost two orders of magnitude reduced complexity.
arXiv Detail & Related papers (2023-04-14T14:03:35Z) - Optical flow estimation from event-based cameras and spiking neural
networks [0.4899818550820575]
Event-based sensors are an excellent fit for Spiking Neural Networks (SNNs).
We propose a U-Net-like SNN which, after supervised training, is able to make dense optical flow estimations.
Thanks to separable convolutions, we have been able to develop a light model that can nonetheless yield reasonably accurate optical flow estimates.
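The parameter savings from the separable convolutions credited above can be shown with the standard depthwise-separable factorisation; the layer sizes are arbitrary and this is generic PyTorch, not the paper's SNN layers.

```python
import torch.nn as nn

cin, cout, k = 64, 128, 3  # arbitrary example sizes

standard = nn.Conv2d(cin, cout, k, padding=1)
separable = nn.Sequential(
    nn.Conv2d(cin, cin, k, padding=1, groups=cin),  # depthwise: one kxk filter per channel
    nn.Conv2d(cin, cout, 1),                        # pointwise: 1x1 channel mixing
)

def count(m):
    return sum(p.numel() for p in m.parameters())

print(count(standard), count(separable))  # 73856 vs 8960 parameters
```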
arXiv Detail & Related papers (2023-02-13T16:17:54Z) - Fast Event-based Optical Flow Estimation by Triplet Matching [13.298845944779108]
Event cameras offer advantages over traditional cameras (low latency, high dynamic range, low power, etc.)
Optical flow estimation methods that work on packets of events trade off speed for accuracy.
We propose a novel optical flow estimation scheme based on triplet matching.
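A loose sketch of the triplet idea: three events that are evenly spaced along a line in space-time support a constant-velocity flow hypothesis. The search strategy and tolerances below are illustrative assumptions, not the paper's algorithm.

```python
def triplet_flow(event, history, dt_tol=1e-3, dxy_tol=1.5):
    """event = (x, y, t); history = earlier events, oldest first.

    Look for two earlier events evenly spaced with `event` in space-time;
    if found, the shared displacement per unit time is the flow estimate.
    """
    x, y, t = event
    for x1, y1, t1 in reversed(history):
        dt, dx, dy = t - t1, x - x1, y - y1
        if dt <= 0:
            continue
        # The third event should sit one more step back along the line.
        x2e, y2e, t2e = x - 2 * dx, y - 2 * dy, t - 2 * dt
        for x2, y2, t2 in reversed(history):
            if (abs(t2 - t2e) < dt_tol and abs(x2 - x2e) <= dxy_tol
                    and abs(y2 - y2e) <= dxy_tol):
                return dx / dt, dy / dt  # pixels per unit time
    return None  # no consistent triplet found

hist = [(8, 12, 0.00), (9, 12, 0.01)]
print(triplet_flow((10, 12, 0.02), hist))  # (100.0, 0.0): 1 px per 0.01 s
```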
arXiv Detail & Related papers (2022-12-23T09:12:16Z) - Globally Optimal Event-Based Divergence Estimation for Ventral Landing [55.29096494880328]
Event sensing is a major component in bio-inspired flight guidance and control systems.
We explore the usage of event cameras for predicting time-to-contact with the surface during ventral landing.
This is achieved by estimating divergence (inverse TTC), which is the rate of radial optic flow, from the event stream generated during landing.
Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value.
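A simplified picture of the contrast-maximisation formulation: warp events radially by a candidate divergence and score the sharpness (variance) of the resulting image. Where the paper's branch-and-bound search is exact, the grid search below is only an illustrative stand-in, and the warp model and sizes are assumptions.

```python
import numpy as np

def contrast(events, D, t_ref=0.0, size=64):
    """Warp events by divergence D and score image sharpness."""
    img = np.zeros((size, size))
    for x, y, t in events:  # coordinates centred on the focus of expansion
        s = np.exp(-D * (t - t_ref))  # undo radial expansion at rate D
        ix, iy = int(round(x * s)) + size // 2, int(round(y * s)) + size // 2
        if 0 <= ix < size and 0 <= iy < size:
            img[iy, ix] += 1
    return img.var()  # sharper warped image -> higher variance

def estimate_divergence(events, candidates=np.linspace(-2.0, 2.0, 81)):
    """Grid search over divergence; the paper instead uses branch-and-bound."""
    return max(candidates, key=lambda D: contrast(events, D))
```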
arXiv Detail & Related papers (2022-09-27T06:00:52Z) - Real-time Object Detection for Streaming Perception [84.2559631820007]
Streaming perception is proposed to jointly evaluate latency and accuracy as a single metric for online video perception.
We build a simple and effective framework for streaming perception.
Our method achieves competitive performance on Argoverse-HD dataset and improves the AP by 4.9% compared to the strong baseline.
arXiv Detail & Related papers (2022-03-23T11:33:27Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - FastFlowNet: A Lightweight Network for Fast Optical Flow Estimation [81.76975488010213]
Dense optical flow estimation plays a key role in many robotic vision tasks.
Current networks often have large numbers of parameters and high computation costs.
Our proposed FastFlowNet works in the well-known coarse-to-fine manner with the following innovations.
arXiv Detail & Related papers (2021-03-08T03:09:37Z) - Dynamic Resource-aware Corner Detection for Bio-inspired Vision Sensors [0.9988653233188148]
We present an algorithm to detect asynchronous corners from a stream of events in real-time on embedded systems.
The proposed algorithm selects the best corner candidate among its neighbors and achieves average execution-time savings of 59%.
arXiv Detail & Related papers (2020-10-29T12:01:33Z)