Mouse Lockbox Dataset: Behavior Recognition for Mice Solving Lockboxes
- URL: http://arxiv.org/abs/2505.15408v3
- Date: Tue, 17 Jun 2025 15:05:18 GMT
- Title: Mouse Lockbox Dataset: Behavior Recognition for Mice Solving Lockboxes
- Authors: Patrik Reiske, Marcus N. Boon, Niek Andresen, Sole Traverso, Katharina Hohlbaum, Lars Lewejohann, Christa Thöne-Reineke, Olaf Hellwich, Henning Sprekeler
- Abstract summary: We present a video dataset of individual mice solving complex mechanical puzzles, so-called lockboxes. The more than 110 hours of total playtime show their behavior recorded from three different perspectives. As a benchmark for frame-level action classification methods, we provide human-annotated labels for all videos of two different mice.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning and computer vision methods have a major impact on the study of natural animal behavior, as they enable the (semi-)automatic analysis of vast amounts of video data. Mice are the standard mammalian model system in most research fields, but the datasets available today to refine such methods focus on either simple or social behaviors. In this work, we present a video dataset of individual mice solving complex mechanical puzzles, so-called lockboxes. The more than 110 hours of total playtime show their behavior recorded from three different perspectives. As a benchmark for frame-level action classification methods, we provide human-annotated labels for all videos of two different mice, amounting to 13% of our dataset. Our keypoint (pose) tracking-based action classification framework illustrates the challenges of automated labeling of fine-grained behaviors, such as the manipulation of objects. We hope that our work will help accelerate the advancement of automated action and behavior classification in the computational neuroscience community. Our dataset is publicly available at https://doi.org/10.14279/depositonce-23850
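As a rough illustration of the kind of keypoint-based, frame-level action classification the paper benchmarks, here is a minimal sketch that classifies each frame from a temporal window of pose keypoints. The array shapes, label set, and classifier choice are illustrative assumptions, not the paper's actual pipeline or the dataset's file format.

```python
# Hypothetical sketch of frame-level action classification from pose
# keypoints. Shapes and labels are assumptions, not the dataset's format.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(keypoints, win=5):
    """Stack keypoints from a centered temporal window around each frame.

    keypoints: (T, K, 2) array of K 2D keypoints per frame.
    Returns a (T, (2*win+1)*K*2) feature matrix.
    """
    T = keypoints.shape[0]
    flat = keypoints.reshape(T, -1)                 # (T, K*2)
    pad = np.pad(flat, ((win, win), (0, 0)), mode="edge")
    return np.hstack([pad[i:i + T] for i in range(2 * win + 1)])

# Toy example: 1000 frames, 8 keypoints, 3 hypothetical action classes.
rng = np.random.default_rng(0)
kps = rng.normal(size=(1000, 8, 2))
labels = rng.integers(0, 3, size=1000)              # e.g. idle / locomotion / manipulation

X = window_features(kps)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:800], labels[:800])                      # train on the first 800 frames
print("frame accuracy:", clf.score(X[800:], labels[800:]))
```

Windowing gives each frame some temporal context, which is what makes fine-grained behaviors such as object manipulation separable at all; the paper's benchmark shows that such behaviors remain challenging even then.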
Related papers
- From Forest to Zoo: Great Ape Behavior Recognition with ChimpBehave
We introduce ChimpBehave, a novel dataset featuring over 2 hours of video (approximately 193,000 video frames) of zoo-housed chimpanzees.
ChimpBehave is meticulously annotated with bounding boxes and behavior labels for action recognition.
We benchmark our dataset using a state-of-the-art CNN-based action recognition model.
arXiv Detail & Related papers (2024-05-30T13:11:08Z)
- BaboonLand Dataset: Tracking Primates in the Wild and Automating Behaviour Recognition from Drone Videos
This study presents a novel dataset from drone videos for baboon detection, tracking, and behavior recognition.
The baboon detection dataset was created by manually annotating all baboons in drone videos with bounding boxes.
The behavior recognition dataset was generated by converting tracks into mini-scenes, a video subregion centered on each animal.
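A mini-scene is essentially a per-animal crop sequence. The sketch below builds one from per-frame track centers; the function name and array layout are assumptions for illustration, not the dataset's actual tooling.

```python
# Hypothetical sketch of the "mini-scene" idea: crop a fixed-size window
# centered on one tracked animal, yielding a small clip per track.
import numpy as np

def track_to_mini_scene(frames, centers, size=128):
    """frames: (T, H, W, 3) video; centers: (T, 2) per-frame (x, y) of one animal.

    Assumes the video is at least `size` pixels in each dimension.
    """
    T, H, W, _ = frames.shape
    half = size // 2
    clips = np.zeros((T, size, size, 3), dtype=frames.dtype)
    for t, (x, y) in enumerate(centers.astype(int)):
        x = np.clip(x, half, W - half)              # keep the crop inside the frame
        y = np.clip(y, half, H - half)
        clips[t] = frames[t, y - half:y + half, x - half:x + half]
    return clips
```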
arXiv Detail & Related papers (2024-05-27T23:09:37Z)
- Automated Behavioral Analysis Using Instance Segmentation
Animal behavior analysis plays a crucial role in various fields, such as life science and biomedical research.
The scarcity of available data and the high cost associated with obtaining a large number of labeled datasets pose significant challenges.
We propose a novel approach that leverages instance segmentation-based transfer learning to address these issues.
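A common way to set up instance-segmentation transfer learning, which may or may not match this paper's exact configuration, is to fine-tune a COCO-pretrained Mask R-CNN after swapping in new box and mask heads:

```python
# Standard torchvision recipe for fine-tuning Mask R-CNN on a small
# custom dataset (a generic sketch, not necessarily the paper's setup).
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

def build_model(num_classes):                        # num_classes includes background
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    in_feats = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, num_classes)
    in_feats_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_feats_mask, 256, num_classes)
    return model

model = build_model(num_classes=2)                   # background + one animal class
```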
arXiv Detail & Related papers (2023-12-12T20:36:36Z)
- MABe22: A Multi-Species Multi-Task Benchmark for Learned Representations of Behavior
We introduce MABe22, a benchmark to assess the quality of learned behavior representations.
This dataset is collected from a variety of biology experiments.
We test self-supervised video and trajectory representation learning methods to demonstrate the use of our benchmark.
arXiv Detail & Related papers (2022-07-21T15:51:30Z)
- Persistent Animal Identification Leveraging Non-Visual Markers
We aim to locate and provide a unique identifier for each mouse in a cluttered home-cage environment through time.
This is a very challenging problem due to (i) the lack of distinguishing visual features for each mouse, and (ii) the close confines of the scene with constant occlusion.
Our approach achieves 77% accuracy on this animal identification problem, and is able to reject spurious detections when the animals are hidden.
arXiv Detail & Related papers (2021-12-13T17:11:32Z)
- Overcoming the Domain Gap in Neural Action Representations
3D pose data can now be reliably extracted from multi-view video sequences without manual intervention.
We propose to use it to guide the encoding of neural action representations together with a set of neural and behavioral augmentations.
To reduce the domain gap, during training, we swap neural and behavioral data across animals that seem to be performing similar actions.
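The swap can be sketched as a simple augmentation: among samples that share an action label, exchange one modality across animals so that identity-specific cues are decoupled from action content. The record layout below is an assumption for illustration.

```python
# Hedged sketch of cross-animal swapping for domain-gap reduction.
import random
from collections import defaultdict

def swap_across_animals(samples):
    """samples: list of dicts with 'animal', 'action', 'neural', 'behavior' keys."""
    by_action = defaultdict(list)
    for s in samples:
        by_action[s["action"]].append(s)
    augmented = []
    for s in samples:
        partners = [p for p in by_action[s["action"]] if p["animal"] != s["animal"]]
        if partners:                                 # pair with another animal, same action
            p = random.choice(partners)
            augmented.append({**s, "neural": p["neural"]})
    return samples + augmented
```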
arXiv Detail & Related papers (2021-12-02T12:45:46Z)
- The Multi-Agent Behavior Dataset: Mouse Dyadic Social Interactions
We present a multi-agent dataset from behavioral neuroscience, the Caltech Mouse Social Interactions (CalMS21) dataset.
Our dataset consists of trajectory data of social interactions, recorded from videos of freely behaving mice in a standard resident-intruder assay.
The CalMS21 dataset is part of the Multi-Agent Behavior Challenge 2021; as a next step, we aim to incorporate datasets from other domains studying multi-agent behavior.
arXiv Detail & Related papers (2021-04-06T17:58:47Z)
- AcinoSet: A 3D Pose Estimation Dataset and Baseline Models for Cheetahs in the Wild
We present an extensive dataset of free-running cheetahs in the wild, called AcinoSet.
The dataset contains 119,490 frames of multi-view synchronized high-speed video footage, camera calibration files and 7,588 human-annotated frames.
The resulting 3D trajectories, human-checked 3D ground truth, and an interactive tool to inspect the data are also provided.
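Given calibration and matched 2D keypoints, such 3D trajectories are typically recovered by triangulation. The following is a generic linear (DLT) sketch, not necessarily AcinoSet's own pipeline.

```python
# Generic direct linear transform (DLT) triangulation of one 3D point
# from multiple calibrated views.
import numpy as np

def triangulate(proj_mats, points_2d):
    """proj_mats: list of (3, 4) camera matrices; points_2d: matching (u, v) per view."""
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        rows.append(u * P[2] - P[0])                 # two linear constraints per view
        rows.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    X = Vt[-1]                                       # null vector = homogeneous 3D point
    return X[:3] / X[3]                              # dehomogenize
```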
arXiv Detail & Related papers (2021-03-24T15:54:11Z)
- Diverse Complexity Measures for Dataset Curation in Self-driving
We propose a new data selection method that exploits a diverse set of criteria to quantify the interestingness of traffic scenes.
Our experiments show that the proposed curation pipeline is able to select datasets that lead to better generalization and higher performance.
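One way such a curation pipeline could look (a hedged sketch, not the paper's method): score each scene on several criteria and greedily select a top set while down-weighting near-duplicates.

```python
# Hypothetical score-based curation: criteria columns and the diversity
# penalty weight are illustrative assumptions.
import numpy as np

def curate(scene_scores, k, weights=None):
    """scene_scores: (N, C) matrix, one column per criterion. Returns k indices."""
    N, C = scene_scores.shape
    w = np.ones(C) / C if weights is None else np.asarray(weights)
    total = scene_scores @ w
    norms = np.linalg.norm(scene_scores, axis=1) + 1e-9
    chosen = []
    for _ in range(k):
        best = int(np.argmax(total))
        chosen.append(best)
        total[best] = -np.inf                        # never pick the same scene twice
        # down-weight scenes similar to the pick to encourage diversity
        sim = (scene_scores @ scene_scores[best]) / (norms * norms[best])
        total -= 0.1 * sim
    return chosen
```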
arXiv Detail & Related papers (2021-01-16T23:45:02Z)
- Multi-view Mouse Social Behaviour Recognition with Deep Graphical Model
Social behaviour analysis of mice is an invaluable tool for assessing the efficacy of therapies for neurodegenerative diseases.
Because of the potential to create rich descriptions of mouse social behaviors, the use of multi-view video recordings for rodent observation is receiving increasing attention.
We propose a novel multiview latent-attention and dynamic discriminative model that jointly learns view-specific and view-shared sub-structures.
arXiv Detail & Related papers (2020-11-04T18:09:58Z)
- Learning Predictive Models From Observation and Interaction
Learning predictive models from interaction with the world allows an agent, such as a robot, to learn about how the world works.
However, learning a model that captures the dynamics of complex skills represents a major challenge.
We propose a method to augment the training set with observational data of other agents, such as humans.
arXiv Detail & Related papers (2019-12-30T01:10:41Z)
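Mixing the two data sources can be sketched with a shared prediction loss in which observation batches simply lack actions; the `model(frames, actions)` interface below is a placeholder assumption, not the paper's API.

```python
# Hedged sketch: one loss over interaction data (actions known) and
# observation data (actions unknown, so the model must cope without them).
import torch.nn.functional as F

def training_step(model, batch, has_actions):
    """Video-prediction loss for one batch from either data source."""
    actions = batch["actions"] if has_actions else None
    pred = model(batch["frames"], actions)           # predict future frames
    return F.mse_loss(pred, batch["target_frames"])

# A training loop would alternate, e.g.:
#   loss = training_step(model, robot_batch, has_actions=True)
#   loss = training_step(model, human_batch, has_actions=False)
```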