kabr-tools: Automated Framework for Multi-Species Behavioral Monitoring
- URL: http://arxiv.org/abs/2510.02030v2
- Date: Wed, 22 Oct 2025 03:57:21 GMT
- Title: kabr-tools: Automated Framework for Multi-Species Behavioral Monitoring
- Authors: Jenna Kline, Maksim Kholiavchenko, Samuel Stevens, Nina van Tiel, Alison Zhong, Namrata Banerji, Alec Sheets, Sowbaranika Balasubramaniam, Isla Duporge, Matthew Thompson, Elizabeth Campolongo, Jackson Miliko, Neil Rosser, Tanya Berger-Wolf, Charles V. Stewart, Daniel I. Rubenstein,
- Abstract summary: We present kabr-tools, an open-source package for automated multi-species behavioral monitoring. This framework integrates drone-based video with machine learning systems to extract behavioral, social, and spatial metrics from wildlife footage. Compared to ground-based methods, drone-based observations significantly improved behavioral granularity, reducing visibility loss by 15%.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A comprehensive understanding of animal behavioral ecology depends on scalable approaches to quantify and interpret complex, multidimensional behavioral patterns. Traditional field observations are often limited in scope, time-consuming, and labor-intensive, hindering the assessment of behavioral responses across landscapes. To address this, we present kabr-tools (Kenyan Animal Behavior Recognition Tools), an open-source package for automated multi-species behavioral monitoring. This framework integrates drone-based video with machine learning systems to extract behavioral, social, and spatial metrics from wildlife footage. Our pipeline leverages object detection, tracking, and behavioral classification systems to generate key metrics, including time budgets, behavioral transitions, social interactions, habitat associations, and group composition dynamics. Compared to ground-based methods, drone-based observations significantly improved behavioral granularity, reducing visibility loss by 15% and capturing more transitions with higher accuracy and continuity. We validate kabr-tools through three case studies, analyzing 969 behavioral sequences, surpassing the capacity of traditional methods for data capture and annotation. We found that vigilance in Grevy's zebras, as in Plains zebras, decreases with herd size, but, unlike in Plains zebras, habitat has a negligible impact. Plains and Grevy's zebras exhibit strong behavioral inertia, with rare transitions to alert behaviors, and we observed spatial segregation between Grevy's zebras, Plains zebras, and giraffes in mixed-species herds. By enabling automated behavioral monitoring at scale, kabr-tools offers a powerful tool for ecosystem-wide studies, advancing conservation, biodiversity research, and ecological monitoring.
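Once the detection, tracking, and classification stages yield a per-frame behavior label for each tracked animal, metrics such as time budgets and behavioral transition probabilities reduce to simple counting. The sketch below illustrates that counting step only; the function names and behavior labels are hypothetical and are not part of the kabr-tools API:

```python
from collections import Counter

def time_budget(labels):
    """Fraction of observation time spent in each behavior."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {behavior: n / total for behavior, n in counts.items()}

def transition_matrix(labels):
    """Row-normalized probabilities of moving between consecutive behaviors."""
    behaviors = sorted(set(labels))
    idx = {b: i for i, b in enumerate(behaviors)}
    counts = [[0] * len(behaviors) for _ in behaviors]
    for a, b in zip(labels, labels[1:]):
        counts[idx[a]][idx[b]] += 1
    matrix = []
    for row in counts:
        s = sum(row)
        matrix.append([c / s if s else 0.0 for c in row])
    return behaviors, matrix

# Hypothetical label sequence for one tracked zebra
seq = ["graze", "graze", "vigilant", "graze", "walk", "graze"]
print(time_budget(seq))
```

A strong diagonal in the transition matrix is exactly the "behavioral inertia" reported in the abstract: most frames transition into the same behavior, with rare moves into alert states.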
Related papers
- Decentralized Vision-Based Autonomous Aerial Wildlife Monitoring [55.159556673975544]
We propose a decentralized vision-based multi-quadrotor system for wildlife monitoring. Our approach enables robust identification and tracking of large species in their natural habitat.
arXiv Detail & Related papers (2025-08-20T20:05:05Z) - A Review on Coarse to Fine-Grained Animal Action Recognition [23.001797172183345]
The review explores the field of animal action recognition, focusing on coarse-grained (CG) and fine-grained (FG) techniques. It examines the current state of research in animal behaviour recognition and elucidates the unique challenges associated with recognising subtle animal actions in outdoor environments. The review outlines future directions for advancing fine-grained action recognition, aiming to improve accuracy and generalisability in behaviour analysis across species.
arXiv Detail & Related papers (2025-06-01T23:31:25Z) - BioCLIP 2: Emergent Properties from Scaling Hierarchical Contrastive Learning [51.341003735575335]
We find emergent behaviors in biological vision models via large-scale contrastive vision-language training. We train BioCLIP 2 on TreeOfLife-200M to distinguish different species. We identify emergent properties in the learned embedding space of BioCLIP 2.
arXiv Detail & Related papers (2025-05-29T17:48:20Z) - MammAlps: A multi-view video behavior monitoring dataset of wild mammals in the Swiss Alps [41.58000025132071]
MammAlps is a dataset of wildlife behavior monitoring from 9 camera traps in the Swiss National Park. Based on 6135 single-animal clips, we propose the first hierarchical and multimodal animal behavior recognition benchmark. We also propose a second, ecology-oriented benchmark aiming at identifying activities, species, number of individuals, and meteorological conditions.
arXiv Detail & Related papers (2025-03-23T21:51:58Z) - Computer Vision for Primate Behavior Analysis in the Wild [61.08941894580172]
Video-based behavioral monitoring has great potential for transforming how we study animal cognition and behavior.
There is still a fairly large gap between the exciting prospects and what can actually be achieved in practice today.
arXiv Detail & Related papers (2024-01-29T18:59:56Z) - Behaviour Modelling of Social Animals via Causal Structure Discovery and Graph Neural Networks [15.542220566525021]
We propose a method to build behavioural models using causal structure discovery and graph neural networks for time series.
We apply this method to a mob of meerkats in a zoo environment and study its ability to predict future actions.
arXiv Detail & Related papers (2023-12-21T23:34:08Z) - Livestock feeding behaviour: A review on automated systems for ruminant monitoring [33.7054351451505]
This paper is the first tutorial-style review on the analysis of the feeding behaviour of ruminants.
It assesses the main sensing methodologies and the main techniques to measure and analyse the signals associated with feeding behaviour.
It also highlights the potentiality of automated monitoring systems to provide valuable information.
arXiv Detail & Related papers (2023-12-03T13:42:55Z) - Livestock Monitoring with Transformer [4.298326853567677]
We develop an end-to-end behaviour monitoring system for group-housed pigs to perform simultaneous instance level segmentation, tracking, action recognition and re-identification tasks.
We present starformer, the first end-to-end multiple-object livestock monitoring framework that learns instance-level embeddings for grouped pigs through the use of transformer architecture.
arXiv Detail & Related papers (2021-11-01T10:03:49Z) - Intersection Regularization for Extracting Semantic Attributes [72.53481390411173]
We consider the problem of supervised classification, such that the features that the network extracts match an unseen set of semantic attributes.
For example, when learning to classify images of birds into species, we would like to observe the emergence of features that zoologists use to classify birds.
We propose training a neural network with discrete top-level activations, which is followed by a multi-layered perceptron (MLP) and a parallel decision tree.
arXiv Detail & Related papers (2021-03-22T14:32:44Z) - A Trainable Optimal Transport Embedding for Feature Aggregation and its Relationship to Attention [96.77554122595578]
We introduce a parametrized representation of fixed size, which embeds and then aggregates elements from a given input set according to the optimal transport plan between the set and a trainable reference.
Our approach scales to large datasets and allows end-to-end training of the reference, while also providing a simple unsupervised learning mechanism with small computational cost.
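The core idea, embedding a variable-size set into a fixed-size representation via a transport plan to a reference, can be sketched with a plain-numpy entropic (Sinkhorn) solver. This is an illustrative reconstruction under simplifying assumptions (uniform marginals, a fixed rather than trainable reference), not the paper's implementation, and all names here are hypothetical:

```python
import numpy as np

def sinkhorn_plan(cost, reg=1.0, n_iters=200):
    """Entropic-regularized optimal transport plan between uniform marginals."""
    n, p = cost.shape
    K = np.exp(-cost / reg)          # Gibbs kernel
    a, b = np.ones(n) / n, np.ones(p) / p
    u, v = np.ones(n), np.ones(p)
    for _ in range(n_iters):         # alternating marginal projections
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]   # n x p plan

def ot_aggregate(X, Z):
    """Embed set X (n x d) into a fixed-size p x d array via transport to reference Z (p x d)."""
    cost = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # squared Euclidean cost
    P = sinkhorn_plan(cost)          # each column of P sums to 1/p
    return P.T @ X * Z.shape[0]      # rescale so each output row is a weighted mean of X

X = np.random.default_rng(0).normal(size=(12, 4))  # variable-size input set
Z = np.random.default_rng(1).normal(size=(3, 4))   # fixed reference set
print(ot_aggregate(X, Z).shape)  # fixed-size output: (3, 4)
```

Each output row is a transport-weighted average of the input elements assigned to the corresponding reference point, which is what makes the connection to attention: the (rescaled) plan plays the role of attention weights over the set.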
arXiv Detail & Related papers (2020-06-22T08:35:58Z) - Automatic image-based identification and biomass estimation of invertebrates [70.08255822611812]
Time-consuming sorting and identification of taxa pose strong limitations on how many insect samples can be processed.
We propose to replace the standard manual approach of human expert-based sorting and identification with an automatic image-based technology.
We use state-of-the-art Resnet-50 and InceptionV3 CNNs for the classification task.
arXiv Detail & Related papers (2020-02-05T21:38:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.