Locating Tennis Ball Impact on the Racket in Real Time Using an Event Camera
- URL: http://arxiv.org/abs/2506.08327v1
- Date: Tue, 10 Jun 2025 01:29:32 GMT
- Title: Locating Tennis Ball Impact on the Racket in Real Time Using an Event Camera
- Authors: Yuto Kase, Kai Ishibe, Ryoma Yasuda, Yudai Washida, Sakiko Hashimoto,
- Abstract summary: Event cameras efficiently measure brightness changes (called 'events') with microsecond accuracy under high-speed motion. Our method consists of three identification steps: time range of swing, timing at impact, and contours of ball and racket.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In racket sports, such as tennis, locating the ball's position at impact is important in clarifying player and equipment characteristics, thereby aiding in personalized equipment design. High-speed cameras are used to measure the impact location; however, their excessive memory consumption limits prolonged scene capture, and manual digitization for position detection is time-consuming and prone to human error. These limitations make it difficult to effectively capture the entire playing scene, hindering the ability to analyze the player's performance. We propose a method for locating the tennis ball impact on the racket in real time using an event camera. Event cameras efficiently measure brightness changes (called 'events') with microsecond accuracy under high-speed motion while consuming less memory. These cameras enable users to continuously monitor their performance over extended periods. Our method consists of three identification steps: time range of swing, timing at impact, and contours of ball and racket. Conventional computer vision techniques are utilized along with an original event-based processing to detect the timing at impact (PATS: the amount of polarity asymmetry in time symmetry). The results of the experiments were within the permissible range for measuring tennis players' performance. Moreover, the computation time was sufficiently short for real-time applications.
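The abstract only names the impact-timing cue (PATS: polarity asymmetry in time symmetry) without giving its formula. A minimal sketch of one plausible reading, assuming events arrive as timestamp/polarity arrays and that an impact shows up as a polarity flip between symmetric windows around the candidate time (the function name, window parameter, and exact score are assumptions, not the paper's definition):

```python
import numpy as np

def pats_score(timestamps, polarities, t_candidate, half_window):
    """Hypothetical PATS-style score: compare the net polarity of events
    in the half-window before a candidate impact time against the
    mirrored half-window after it. A sudden polarity flip (as at a
    ball-racket impact) makes the two windows maximally asymmetric."""
    before = (timestamps >= t_candidate - half_window) & (timestamps < t_candidate)
    after = (timestamps >= t_candidate) & (timestamps < t_candidate + half_window)
    net_before = polarities[before].sum()  # +1/-1 events summed
    net_after = polarities[after].sum()
    return abs(net_before - net_after)

# toy event stream: negative polarity before t = 0.5 s, positive after
ts = np.linspace(0.0, 1.0, 1000)
pol = np.where(ts < 0.5, -1, 1)
scores = [pats_score(ts, pol, t, 0.1) for t in (0.25, 0.5, 0.75)]
# the score peaks at the flip time t = 0.5
assert scores[1] > scores[0] and scores[1] > scores[2]
```

In this sketch the impact timing would be recovered by scanning candidate times within the detected swing range and taking the argmax of the score.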
Related papers
- Egocentric Event-Based Vision for Ping Pong Ball Trajectory Prediction [17.147140984254655]
We present a real-time egocentric trajectory prediction system for table tennis using event cameras. We collect a dataset of ping-pong game sequences, including 3D ground-truth trajectories of the ball, synchronized with sensor data from the Meta Project Aria glasses. Our detection pipeline has a worst-case total latency of 4.5 ms, including computation and perception.
arXiv Detail & Related papers (2025-06-09T15:22:55Z) - An Event-Based Perception Pipeline for a Table Tennis Robot [12.101426862186072]
We present the first real-time perception pipeline for a table tennis robot that uses only event-based cameras. We show that compared to a frame-based pipeline, event-based perception pipelines have an update rate which is an order of magnitude higher.
arXiv Detail & Related papers (2025-02-02T10:56:37Z) - BroadTrack: Broadcast Camera Tracking for Soccer [6.011159943695013]
Camera calibration and localization enables many applications in the context of soccer broadcasting. We present a system capable of addressing the task of soccer broadcast camera tracking efficiently, robustly, and accurately. Our tracking system, BroadTrack, halves the mean reprojection error rate and gains more than 15% in terms of Jaccard index for camera calibration on the SoccerNet dataset.
arXiv Detail & Related papers (2024-12-02T17:10:52Z) - Event-Based Tracking Any Point with Motion-Augmented Temporal Consistency [58.719310295870024]
This paper presents an event-based framework for tracking any point. It tackles the challenges posed by spatial sparsity and motion sensitivity in events. It achieves 150% faster processing with competitive model parameters.
arXiv Detail & Related papers (2024-12-02T09:13:29Z) - Deblur e-NeRF: NeRF from Motion-Blurred Events under High-speed or Low-light Conditions [56.84882059011291]
We propose Deblur e-NeRF, a novel method to reconstruct blur-minimal NeRFs from motion-blurred events.
We also introduce a novel threshold-normalized total variation loss to improve the regularization of large textureless patches.
arXiv Detail & Related papers (2024-09-26T15:57:20Z) - Investigating Event-Based Cameras for Video Frame Interpolation in Sports [59.755469098797406]
We present a first investigation of event-based Video Frame Interpolation (VFI) models for generating sports slow-motion videos.
Particularly, we design and implement a bi-camera recording setup, including an RGB and an event-based camera to capture sports videos, to temporally align and spatially register both cameras.
Our experimental validation demonstrates that TimeLens, an off-the-shelf event-based VFI model, can effectively generate slow-motion footage for sports videos.
arXiv Detail & Related papers (2024-07-02T15:39:08Z) - Table tennis ball spin estimation with an event camera [11.735290341808064]
In table tennis, the combination of high velocity and spin renders traditional low frame rate cameras inadequate.
We present the first method for table tennis spin estimation using an event camera.
We achieve a spin magnitude mean error of $10.7 \pm 17.3$ rps and a spin axis mean error of $32.9 \pm 38.2$ deg in real time for a flying ball.
arXiv Detail & Related papers (2024-04-15T15:36:38Z) - EV-Catcher: High-Speed Object Catching Using Low-latency Event-based Neural Networks [107.62975594230687]
We demonstrate an application where event cameras excel: accurately estimating the impact location of fast-moving objects.
We introduce a lightweight event representation called Binary Event History Image (BEHI) to encode event data at low latency.
We show that the system is capable of achieving a success rate of 81% in catching balls targeted at different locations, with a velocity of up to 13 m/s even on compute-constrained embedded platforms.
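The BEHI representation is only named in this summary, not defined. A minimal sketch under the assumption that it is a single binary frame marking pixels that received at least one event during the accumulation window, regardless of polarity (the function name and interface are hypothetical; the paper's actual encoding may differ, e.g. by weighting recency):

```python
import numpy as np

def binary_event_history_image(xs, ys, width, height):
    """Hypothetical Binary Event History Image: a pixel is 1 if any
    event fired there in the accumulation window, else 0. This collapses
    a sparse event stream into one fixed-size, low-latency input frame."""
    img = np.zeros((height, width), dtype=np.uint8)
    img[ys, xs] = 1  # duplicate events at a pixel still yield 1
    return img

# events from a ball moving diagonally across an 8x8 sensor
xs = np.array([1, 2, 3, 4])
ys = np.array([1, 2, 3, 4])
behi = binary_event_history_image(xs, ys, 8, 8)
assert behi.sum() == 4 and behi[2, 2] == 1
```

The appeal of such an encoding is that a lightweight CNN can consume it directly, avoiding per-event processing on embedded hardware.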
arXiv Detail & Related papers (2023-04-14T15:23:28Z) - Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and/or highly-accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z) - ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.