Investigating Event-Based Cameras for Video Frame Interpolation in Sports
- URL: http://arxiv.org/abs/2407.02370v2
- Date: Wed, 3 Jul 2024 08:32:51 GMT
- Title: Investigating Event-Based Cameras for Video Frame Interpolation in Sports
- Authors: Antoine Deckyvere, Anthony Cioppa, Silvio Giancola, Bernard Ghanem, Marc Van Droogenbroeck
- Abstract summary: We present a first investigation of event-based Video Frame Interpolation (VFI) models for generating sports slow-motion videos.
In particular, we design and implement a bi-camera recording setup, comprising an RGB camera and an event-based camera, to capture sports videos, and we temporally align and spatially register the two cameras.
Our experimental validation demonstrates that TimeLens, an off-the-shelf event-based VFI model, can effectively generate slow-motion footage for sports videos.
- Score: 59.755469098797406
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Slow-motion replays provide a thrilling perspective on pivotal moments within sports games, offering a fresh and captivating visual experience. However, capturing slow-motion footage typically demands high-tech, expensive cameras and infrastructure. Deep learning Video Frame Interpolation (VFI) techniques have emerged as a promising avenue, capable of generating high-speed footage from regular camera feeds. Moreover, the use of event-based cameras has recently garnered attention, as they provide valuable motion information between frames that further enhances VFI performance. In this work, we present a first investigation of event-based VFI models for generating sports slow-motion videos. In particular, we design and implement a bi-camera recording setup, comprising an RGB camera and an event-based camera, to capture sports videos, and we temporally align and spatially register the two cameras. Our experimental validation demonstrates that TimeLens, an off-the-shelf event-based VFI model, can effectively generate slow-motion footage for sports videos. This first investigation underscores the practical utility of event-based cameras in producing sports slow-motion content and lays the groundwork for future research in this domain.
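The abstract describes temporally aligning an RGB camera and an event camera that run on independent clocks. As a minimal sketch of one common alignment approach (not necessarily the authors' exact procedure), the clock offset can be estimated by cross-correlating two activity signals: per-frame intensity change on the RGB side and the event rate on the event side. The function name and all parameters below are illustrative assumptions.

```python
import numpy as np

def estimate_time_offset(frame_timestamps, frames, event_timestamps,
                         bin_width=0.01, max_lag_bins=50):
    """Estimate the clock offset between an RGB camera and an event camera.

    Idea: large frame-to-frame intensity changes should coincide with bursts
    of events, so the cross-correlation of the two activity signals peaks at
    the true offset. Returns the shift (in seconds) to add to the event
    timestamps to bring them onto the frame clock. This is a generic
    alignment sketch, not the exact method used in the paper.
    """
    # Per-frame motion activity: mean absolute intensity change.
    frame_activity = np.array([
        np.abs(frames[i + 1].astype(float) - frames[i].astype(float)).mean()
        for i in range(len(frames) - 1)
    ])
    activity_times = np.asarray(frame_timestamps)[1:]

    # Resample both signals onto a common time grid of width `bin_width`.
    t0 = min(activity_times[0], event_timestamps[0])
    t1 = max(activity_times[-1], event_timestamps[-1])
    grid = np.arange(t0, t1, bin_width)
    frame_signal = np.interp(grid, activity_times, frame_activity)
    event_signal, _ = np.histogram(event_timestamps, bins=np.append(grid, t1))

    # Zero-mean both signals, then scan integer-bin lags for the peak score.
    f = frame_signal - frame_signal.mean()
    e = event_signal - event_signal.mean()
    lags = np.arange(-max_lag_bins, max_lag_bins + 1)
    scores = [np.dot(f, np.roll(e, lag)) for lag in lags]
    return lags[int(np.argmax(scores))] * bin_width
```

In practice the resulting offset would be applied to the event timestamps before feeding frame/event pairs to a VFI model such as TimeLens; spatial registration (e.g. a homography between the two sensors) is a separate step not shown here.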
Related papers
- Cavia: Camera-controllable Multi-view Video Diffusion with View-Integrated Attention [62.2447324481159]
Cavia is a novel framework for camera-controllable, multi-view video generation.
Our framework extends the spatial and temporal attention modules, improving both viewpoint and temporal consistency.
Cavia is the first of its kind that allows the user to specify distinct camera motions while obtaining consistent object motion.
arXiv Detail & Related papers (2024-10-14T17:46:32Z)
- Image Conductor: Precision Control for Interactive Video Synthesis [90.2353794019393]
Filmmaking and animation production often require sophisticated techniques for coordinating camera transitions and object movements.
Image Conductor is a method for precise control of camera transitions and object movements to generate video assets from a single image.
arXiv Detail & Related papers (2024-06-21T17:55:05Z)
- TimeRewind: Rewinding Time with Image-and-Events Video Diffusion [10.687722181495065]
This paper addresses the novel challenge of "rewinding" time from a single captured image to recover the fleeting moments missed just before the shutter button is pressed.
We overcome this challenge by leveraging the emerging technology of neuromorphic event cameras, which capture motion information with high temporal resolution.
Our proposed framework introduces an event motion adaptor conditioned on event camera data, guiding the diffusion model to generate videos that are visually coherent and physically grounded in the captured events.
arXiv Detail & Related papers (2024-03-20T17:57:02Z)
- Dynamic Storyboard Generation in an Engine-based Virtual Environment for Video Production [92.14891282042764]
We present Virtual Dynamic Storyboard (VDS) to allow users storyboarding shots in virtual environments.
VDS runs on a "propose-simulate-discriminate" mode: Given a formatted story script and a camera script as input, it generates several character animation and camera movement proposals.
To pick the top-quality dynamic storyboard from the candidates, we equip it with a shot ranking discriminator based on shot quality criteria learned from manually created professional data.
arXiv Detail & Related papers (2023-01-30T06:37:35Z)
- Learning Variational Motion Prior for Video-based Motion Capture [31.79649766268877]
We present a novel variational motion prior (VMP) learning approach for video-based motion capture.
Our framework can effectively reduce temporal jittering and failure modes in frame-wise pose estimation.
Experiments over both public datasets and in-the-wild videos have demonstrated the efficacy and generalization capability of our framework.
arXiv Detail & Related papers (2022-10-27T02:45:48Z)
- Stereo Co-capture System for Recording and Tracking Fish with Frame- and Event Cameras [11.87305195196131]
We introduce a co-capture system for multi-animal visual data acquisition using conventional cameras and event cameras.
Event cameras offer multiple advantages over frame-based cameras, such as a high temporal resolution and temporal redundancy suppression.
We present an event-based multi-animal tracking algorithm, which proves the feasibility of the approach.
arXiv Detail & Related papers (2022-07-15T08:04:10Z)
- Video frame interpolation for high dynamic range sequences captured with dual-exposure sensors [24.086089662881044]
Video frame interpolation (VFI) enables many important applications involving the temporal domain.
One of the key challenges is handling high dynamic range scenes in the presence of complex motion.
arXiv Detail & Related papers (2022-06-19T20:29:34Z)
- TimeReplayer: Unlocking the Potential of Event Cameras for Video Interpolation [78.99283105497489]
Event cameras are a new type of device that enable video interpolation in the presence of arbitrarily complex motion.
This paper proposes a novel TimeReplayer algorithm to interpolate videos captured by commodity cameras with events.
arXiv Detail & Related papers (2022-03-25T18:57:42Z)
- Smart Director: An Event-Driven Directing System for Live Broadcasting [110.30675947733167]
Smart Director aims at mimicking the typical human-in-the-loop broadcasting process to automatically create near-professional broadcasting programs in real-time.
Our system is the first end-to-end automated directing system for multi-camera sports broadcasting.
arXiv Detail & Related papers (2022-01-11T16:14:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.