Fusing Frame and Event Vision for High-speed Optical Flow for Edge
Application
- URL: http://arxiv.org/abs/2207.10720v1
- Date: Thu, 21 Jul 2022 19:15:05 GMT
- Title: Fusing Frame and Event Vision for High-speed Optical Flow for Edge
Application
- Authors: Ashwin Sanjay Lele, Arijit Raychowdhury
- Abstract summary: Event cameras provide continuous asynchronous event streams overcoming the frame-rate limitation.
We fuse the complementary accuracy and speed advantages of the frame and event-based pipelines to provide high-speed optical flow.
- Score: 2.048335092363435
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optical flow computation with frame-based cameras provides high accuracy but
the speed is limited either by the model size of the algorithm or by the frame
rate of the camera. This makes it inadequate for high-speed applications. Event
cameras provide continuous asynchronous event streams overcoming the frame-rate
limitation. However, algorithms for processing event data either borrow a
frame-like setup, which limits speed, or suffer from lower accuracy. We fuse the
complementary accuracy and speed advantages of the frame and event-based
pipelines to provide high-speed optical flow while maintaining a low error
rate. Our bio-mimetic network is validated on the MVSEC dataset, showing a 19%
error degradation at a 4x speed-up. We then demonstrate the system in a
high-speed drone flight scenario where a high-speed event camera computes the
flow even before the optical camera sees the drone, making it well suited for
applications such as tracking and segmentation. This work shows that the fundamental
trade-offs in frame-based processing may be overcome by fusing data from other
modalities.
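The fusion idea described above can be illustrated with a minimal sketch: events are accumulated into a frame-like representation, and a slow-but-accurate frame-based flow estimate is blended with a faster event-based one. The function names and the blending weight `alpha` are hypothetical illustrations, not the paper's actual network.

```python
import numpy as np

def accumulate_events(events, shape):
    """Accumulate asynchronous events (x, y, polarity) into a frame-like image."""
    img = np.zeros(shape, dtype=np.float32)
    for x, y, pol in events:
        img[y, x] += 1.0 if pol > 0 else -1.0
    return img

def fuse_flow(frame_flow, event_flow, alpha=0.75):
    """Blend slow-but-accurate frame flow with fast event flow.

    alpha is a hypothetical confidence weight for the frame pipeline;
    the event pipeline refreshes the estimate between frames.
    """
    return alpha * frame_flow + (1.0 - alpha) * event_flow

# Toy example: a stale frame-pipeline flow refreshed by a newer event estimate.
frame_flow = np.full((4, 4, 2), 1.0)   # last flow from the frame pipeline
event_flow = np.full((4, 4, 2), 2.0)   # newer estimate from the event pipeline
fused = fuse_flow(frame_flow, event_flow)
print(fused[0, 0])  # each component is 0.75*1.0 + 0.25*2.0 = 1.25
```

This captures only the complementarity argument (accuracy from frames, speed from events); the paper itself realizes the fusion with a bio-mimetic network rather than a fixed weighted average.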
Related papers
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms
with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z) - Event-based Background-Oriented Schlieren [18.2247510082534]
Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding.
Event cameras offer potential advantages (high dynamic range, high temporal resolution, and data efficiency) to overcome such limitations due to their bio-inspired sensing principle.
This paper presents a novel technique for perceiving air convection using events and frames by providing the first theoretical analysis that connects event data and schlieren.
arXiv Detail & Related papers (2023-11-01T10:57:20Z) - Towards Anytime Optical Flow Estimation with Event Cameras [35.685866753715416]
Event cameras are capable of responding to log-brightness changes in microseconds.
Existing datasets collected via event cameras provide limited frame rate optical flow ground truth.
We propose EVA-Flow, an EVent-based Anytime Flow estimation network to produce high-frame-rate event optical flow.
arXiv Detail & Related papers (2023-07-11T06:15:12Z) - EvConv: Fast CNN Inference on Event Camera Inputs For High-Speed Robot
Perception [1.3869227429939426]
Event cameras capture visual information with a high temporal resolution and a wide dynamic range.
Convolutional neural network inference on event camera streams cannot yet run in real time at the high rates at which event cameras operate.
This paper presents EvConv, a new approach to enable fast inference on CNNs for inputs from event cameras.
arXiv Detail & Related papers (2023-03-08T15:47:13Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z) - SCFlow: Optical Flow Estimation for Spiking Camera [50.770803466875364]
The spiking camera has enormous potential in real applications, especially for motion estimation in high-speed scenes.
Optical flow estimation has achieved remarkable success in image-based and event-based vision, but existing methods cannot be directly applied to the spike stream from a spiking camera.
This paper presents SCFlow, a novel deep learning pipeline for optical flow estimation for spiking cameras.
arXiv Detail & Related papers (2021-10-08T06:16:45Z) - TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z) - Motion-blurred Video Interpolation and Extrapolation [72.3254384191509]
We present a novel framework for deblurring, interpolating and extrapolating sharp frames from a motion-blurred video in an end-to-end manner.
To ensure temporal coherence across predicted frames and address potential temporal ambiguity, we propose a simple, yet effective flow-based rule.
arXiv Detail & Related papers (2021-03-04T12:18:25Z) - FLAVR: Flow-Agnostic Video Representations for Fast Frame Interpolation [97.99012124785177]
FLAVR is a flexible and efficient architecture that uses 3D space-time convolutions to enable end-to-end learning and inference for video frame interpolation.
We demonstrate that FLAVR can serve as a useful self-supervised pretext task for action recognition, optical flow estimation, and motion magnification.
arXiv Detail & Related papers (2020-12-15T18:59:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.