Fish Tracking Challenge 2024: A Multi-Object Tracking Competition with Sweetfish Schooling Data
- URL: http://arxiv.org/abs/2409.00339v1
- Date: Sat, 31 Aug 2024 03:26:53 GMT
- Title: Fish Tracking Challenge 2024: A Multi-Object Tracking Competition with Sweetfish Schooling Data
- Authors: Makoto M. Itoh, Qingrui Hu, Takayuki Niizato, Hiroaki Kawashima, Keisuke Fujii
- Abstract summary: The Fish Tracking Challenge 2024 introduces a multi-object tracking competition focused on the behaviors of schooling sweetfish.
Using the SweetFish dataset, participants are tasked with developing advanced tracking models to accurately monitor the locations of 10 sweetfishes simultaneously.
By leveraging video data and bounding box annotations, the competition aims to foster innovation in automatic detection and tracking algorithms.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The study of collective animal behavior, especially in aquatic environments, presents unique challenges and opportunities for understanding movement and interaction patterns in the fields of ethology, ecology, and bio-navigation. The Fish Tracking Challenge 2024 (https://ftc-2024.github.io/) introduces a multi-object tracking competition focused on the intricate behaviors of schooling sweetfish. Using the SweetFish dataset, participants are tasked with developing advanced tracking models to accurately monitor the locations of 10 sweetfishes simultaneously. This paper introduces the competition's background, objectives, the SweetFish dataset, and the approaches of the 1st- to 3rd-place winners alongside our baseline. By leveraging video data and bounding box annotations, the competition aims to foster innovation in automatic detection and tracking algorithms, addressing the complexities of aquatic animal movements. The challenge highlights the importance of multi-object tracking for discovering the dynamics of collective animal behavior, with the potential to significantly advance scientific understanding in the above fields.
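The core task described above — keeping ten fish identities consistent across frames given per-frame bounding boxes — can be illustrated with a minimal IoU-based greedy matcher. This is a generic sketch of the association step common to such trackers, not the competition's actual baseline; all function names here are hypothetical.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def greedy_match(tracks, detections, thresh=0.3):
    """Greedily assign each track to its best-overlapping detection.

    tracks: dict {track_id: last_box}; detections: list of boxes.
    Returns {track_id: detection_index} for matches above thresh.
    """
    # Score every (track, detection) pair, best overlaps first.
    pairs = sorted(
        ((iou(box, det), tid, di)
         for tid, box in tracks.items()
         for di, det in enumerate(detections)),
        reverse=True,
    )
    assigned, used = {}, set()
    for score, tid, di in pairs:
        if score < thresh:
            break  # remaining pairs overlap too little
        if tid in assigned or di in used:
            continue  # track or detection already taken
        assigned[tid] = di
        used.add(di)
    return assigned
```

Real entries would refine this with motion prediction (e.g. a Kalman filter) and appearance features to survive the occlusions and identity switches that dense schooling produces, but the greedy IoU step above is the usual starting point.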
Related papers
- V3Det Challenge 2024 on Vast Vocabulary and Open Vocabulary Object Detection: Methods and Results [142.5704093410454]
The V3Det Challenge 2024 aims to push the boundaries of object detection research.
The challenge consists of two tracks: Vast Vocabulary Object Detection and Open Vocabulary Object Detection.
We aim to inspire future research directions in vast vocabulary and open-vocabulary object detection.
arXiv Detail & Related papers (2024-06-17T16:58:51Z)
- Watching Swarm Dynamics from Above: A Framework for Advanced Object Tracking in Drone Videos [2.2159863221761165]
We propose a novel approach for tracking schools of fish in the open ocean from drone videos.
Rather than performing classical 2D object tracking alone, the framework tracks the position and spatial expansion of the fish school in world coordinates by fusing video data with the drone's onboard sensor information (GPS and IMU).
The presented framework for the first time allows researchers to study the collective behavior of fish schools in their natural social and environmental context in a non-invasive and scalable way.
arXiv Detail & Related papers (2024-06-11T19:57:00Z)
- WoodScape Motion Segmentation for Autonomous Driving -- CVPR 2023 OmniCV Workshop Challenge [2.128156484618108]
WoodScape fisheye motion segmentation challenge for autonomous driving was held as part of the CVPR 2023 Workshop on Omnidirectional Computer Vision.
We provide a detailed analysis on the competition which attracted the participation of 112 global teams and a total of 234 submissions.
arXiv Detail & Related papers (2023-12-31T23:53:50Z)
- OmniMotionGPT: Animal Motion Generation with Limited Data [70.35662376853163]
We introduce AnimalML3D, the first text-animal motion dataset with 1240 animation sequences spanning 36 different animal identities.
We are able to generate animal motions with high diversity and fidelity, quantitatively and qualitatively outperforming the results of training human motion generation baselines on animal data.
arXiv Detail & Related papers (2023-11-30T07:14:00Z)
- Indiscernible Object Counting in Underwater Scenes [91.86044762367945]
Indiscernible object counting aims to count objects that blend into their surroundings.
We present a large-scale dataset IOCfish5K which contains a total of 5,637 high-resolution images and 659,024 annotated center points.
arXiv Detail & Related papers (2023-04-23T15:09:02Z)
- BrackishMOT: The Brackish Multi-Object Tracking Dataset [20.52569822945148]
There exist no publicly available annotated underwater multi-object tracking (MOT) datasets captured in turbid environments.
BrackishMOT consists of 98 sequences captured in the wild. Alongside the novel dataset, we present baseline results by training a state-of-the-art tracker.
We analyse the effects of including synthetic data during training and show that a combination of real and synthetic underwater training data can enhance tracking performance.
arXiv Detail & Related papers (2023-02-21T13:02:36Z)
- A Survey of Fish Tracking Techniques Based on Computer Vision [11.994865945394139]
This paper presents a review of the advancements in fish tracking technologies over the past seven years, through 2023.
It explores diverse fish tracking techniques with an emphasis on fundamental localization and tracking methods.
It also summarizes open-source datasets, evaluation metrics, challenges, and applications in fish tracking research.
arXiv Detail & Related papers (2021-10-06T07:46:35Z)
- Unlocking the potential of deep learning for marine ecology: overview, applications, and outlook [8.3226670069051]
This paper aims to bridge the gap between marine ecologists and computer scientists.
We provide insight into popular deep learning approaches for ecological data analysis in plain language.
We illustrate challenges and opportunities through established and emerging applications of deep learning to marine ecology.
arXiv Detail & Related papers (2021-09-29T21:59:16Z)
- Movement Tracks for the Automatic Detection of Fish Behavior in Videos [63.85815474157357]
We offer a dataset of sablefish (Anoplopoma fimbria) startle behaviors in underwater videos, and investigate the use of deep learning (DL) methods for behavior detection on it.
Our proposed detection system identifies fish instances using DL-based frameworks, determines trajectory tracks, derives novel behavior-specific features, and employs Long Short-Term Memory (LSTM) networks to identify startle behavior in sablefish.
arXiv Detail & Related papers (2020-11-28T05:51:19Z)
- Self-supervised Object Tracking with Cycle-consistent Siamese Networks [55.040249900677225]
We exploit an end-to-end Siamese network in a cycle-consistent self-supervised framework for object tracking.
We propose to integrate a Siamese region proposal and mask regression network in our tracking framework so that a fast and more accurate tracker can be learned without the annotation of each frame.
arXiv Detail & Related papers (2020-08-03T04:10:38Z)
- TAO: A Large-Scale Benchmark for Tracking Any Object [95.87310116010185]
The Tracking Any Object (TAO) dataset consists of 2,907 high-resolution videos, captured in diverse environments, averaging half a minute in length.
We ask annotators to label objects that move at any point in the video, and give names to them post factum.
Our vocabulary is both significantly larger and qualitatively different from existing tracking datasets.
arXiv Detail & Related papers (2020-05-20T21:07:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.