The modular pipeline processes drone video through object detection, individual tracking, and machine learning-based behavioral classification to generate ecological metrics, including time budgets, behavioral transitions, habitat use, social interactions, and demographic data. The framework's modular design enables integration of novel ML models and adaptation across species and study systems.
Understanding community-level ecological patterns requires scalable methods to process multi-dimensional behavioral data. Traditional field observations are limited in scope, making it difficult to assess behavioral responses across landscapes. To address this, we present kabr-tools (Kenyan Animal Behavior Recognition tools), an open-source computational ecology framework that integrates drone-based video with machine learning to automatically extract behavioral, social, and spatial metrics from wildlife footage.
Our pipeline processes multi-species drone data using object detection, tracking, and behavioral classification to generate five key metrics: time budgets, behavioral transitions, social interactions, habitat associations, and group composition dynamics. Validated on three African species, our system achieved 65-70% behavioral classification accuracy, with detection accuracy exceeding 95%.
Figure 1: kabr-tools computational framework for automated wildlife behavioral monitoring.
kabr-tools requires Python 3.10 or 3.11.

Base installation:
pip install -e .

Behavior classification with SlowFast:
pip install -e ".[slowfast]"

Development tools:
pip install -e ".[dev]"

Documentation tools:
pip install -e ".[docs]"

All-in-one (recommended for contributors):
pip install -e ".[slowfast,dev,docs]"

Note: For CUDA-specific torch versions, refer to pytorch.org.
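Because the package supports only Python 3.10 and 3.11, it can save an installation attempt to check the interpreter first. A minimal sketch (this helper is illustrative and not part of kabr-tools itself):

```python
import sys

def is_supported(version=None):
    """Return True if the (major, minor) version is in the supported 3.10-3.11 range."""
    major, minor = (version or sys.version_info)[:2]
    return (major, minor) in {(3, 10), (3, 11)}

if __name__ == "__main__":
    major, minor = sys.version_info[:2]
    status = "is supported" if is_supported() else "is not supported; use 3.10 or 3.11"
    print(f"Python {major}.{minor} {status}")
```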
The KABR tools pipeline consists of:
- Drone-based video collection
- Annotation and tracking using CVAT
- Mini-scene extraction
- Behavior classification
- Ecological analysis
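The ecological analysis step turns sequences of behavior labels (produced by the classification step) into metrics such as time budgets and behavioral transitions. A minimal, self-contained sketch of that computation (illustrative only; the function names and label strings are hypothetical, not the actual kabr-tools API):

```python
from collections import Counter

def time_budget(labels):
    """Fraction of observations spent in each behavior."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {behavior: n / total for behavior, n in counts.items()}

def transition_counts(labels):
    """Count transitions between consecutive, differing behavior labels."""
    return Counter((a, b) for a, b in zip(labels, labels[1:]) if a != b)

# Hypothetical per-interval labels for one tracked individual.
labels = ["graze", "graze", "vigilance", "graze", "walk", "walk"]
budget = time_budget(labels)            # graze: 0.5, vigilance: ~0.17, walk: ~0.33
transitions = transition_counts(labels) # e.g. ('graze', 'vigilance') -> 1
```

Normalizing each row of the transition counts by its total would yield the transition probability matrix commonly reported in behavioral studies.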
Each script can be run from the command line (<tool-name> -h) or imported as a module.
See Methodology Comparison for a detailed comparison of methods for animal behavior analysis, including:
- Focal sampling
- Scan sampling
- Drone-based video analysis
See Case Studies for examples demonstrating the application of kabr-tools in various ecological contexts, including:
- Grevy's zebra time budgets
- Mixed-species social interactions
Figure 2: Time budget comparison between drone-based behavior classification (bottom) and manual field focal observations (top), illustrating the difference in behavioral granularity between drone video and traditional field observation.
Please refer to our papers for details on the data collection process and machine learning model development.
- KABR: In-Situ Dataset for Kenyan Animal Behavior Recognition
- Deep dive into KABR: a dataset for understanding ungulate behavior from in-situ drone video
- Integrating Biological Data into Autonomous Remote Sensing Systems for In Situ Imageomics: A Case Study for Kenyan Animal Behavior Sensing with Unmanned Aerial Vehicles (UAVs)
- A Framework for Autonomic Computing for In Situ Imageomics
If you use this toolkit in your research, please cite both this package and the associated paper (kabr-tools: Automated Framework for Multi-Species Behavioral Monitoring):
Package citation:
@software{kabr-tools,
author = {Kline, Jenna and Zhong, Alison and Campolongo, Elizabeth and Kholiavchenko, Maksim},
title = {kabr-tools: Tools for annotating animal behavior in drone videos},
version = {3.0.0},
year = {2025},
doi = {10.5281/zenodo.11288083},
url = {https://github.com/Imageomics/kabr-tools}
}
Paper citation:
@misc{kline2025kabrtoolsautomatedframeworkmultispecies,
title={kabr-tools: Automated Framework for Multi-Species Behavioral Monitoring},
author={Jenna Kline and Maksim Kholiavchenko and Samuel Stevens and Nina van Tiel and Alison Zhong and Namrata Banerji and Alec Sheets and Sowbaranika Balasubramaniam and Isla Duporge and Matthew Thompson and Elizabeth Campolongo and Jackson Miliko and Neil Rosser and Tanya Berger-Wolf and Charles V. Stewart and Daniel I. Rubenstein},
year={2025},
eprint={2510.02030},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2510.02030},
}
Open issues on GitHub.
This project is licensed under the MIT License.