Low-power, Continuous Remote Behavioral Localization with Event Cameras
- URL: http://arxiv.org/abs/2312.03799v2
- Date: Tue, 19 Mar 2024 16:08:37 GMT
- Title: Low-power, Continuous Remote Behavioral Localization with Event Cameras
- Authors: Friedhelm Hamann, Suman Ghosh, Ignacio Juarez Martinez, Tom Hart, Alex Kacelnik, Guillermo Gallego
- Abstract summary: Event cameras offer unique advantages for battery-dependent remote monitoring.
We use this sensor to quantify a behavior in Chinstrap penguins called ecstatic display.
Experiments show that the event cameras' natural response to motion is effective for continuous behavior monitoring and detection.
- Score: 9.107129038623242
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Researchers in natural science need reliable methods for quantifying animal behavior. Recently, numerous computer vision methods emerged to automate the process. However, observing wild species at remote locations remains a challenging task due to difficult lighting conditions and constraints on power supply and data storage. Event cameras offer unique advantages for battery-dependent remote monitoring due to their low power consumption and high dynamic range capabilities. We use this novel sensor to quantify a behavior in Chinstrap penguins called ecstatic display. We formulate the problem as a temporal action detection task, determining the start and end times of the behavior. For this purpose, we recorded a colony of breeding penguins in Antarctica for several weeks and labeled event data on 16 nests. The developed method consists of a generator of candidate time intervals (proposals) and a classifier of the actions within them. The experiments show that the event cameras' natural response to motion is effective for continuous behavior monitoring and detection, reaching a mean average precision (mAP) of 58% (which increases to 63% in good weather conditions). The results also demonstrate the robustness against various lighting conditions contained in the challenging dataset. The low-power capabilities of the event camera allow it to record significantly longer than with a conventional camera. This work pioneers the use of event cameras for remote wildlife observation, opening new interdisciplinary opportunities. https://tub-rip.github.io/eventpenguins/
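The abstract describes a two-stage pipeline: generate candidate time intervals (proposals) from the event stream, then classify the action within each. Below is a minimal sketch of that proposal-then-classify idea; the event-count representation, window sizes, thresholds, and the rate-based stand-in classifier are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a proposal-then-classify pipeline for temporal action
# detection on an event stream. Window sizes, thresholds, and the classifier
# are illustrative assumptions, not the paper's implementation.
import numpy as np

def generate_proposals(timestamps, win=1.0, min_len=2.0, thresh=500):
    """Slide a window over event timestamps (seconds) and return
    candidate (start, end) intervals where event activity is high."""
    t0, t1 = timestamps.min(), timestamps.max()
    bins = np.arange(t0, t1 + win, win)
    counts, _ = np.histogram(timestamps, bins=bins)
    active = counts > thresh                      # high-motion windows
    proposals, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = bins[i]
        elif not a and start is not None:
            if bins[i] - start >= min_len:        # drop very short intervals
                proposals.append((start, bins[i]))
            start = None
    if start is not None and bins[-1] - start >= min_len:
        proposals.append((start, bins[-1]))
    return proposals

def classify_proposal(timestamps, start, end):
    """Placeholder action classifier: scores a proposal by its mean event
    rate; a learned model (e.g., a CNN on event frames) would go here."""
    n = np.sum((timestamps >= start) & (timestamps < end))
    rate = n / max(end - start, 1e-6)
    return rate > 1000                            # hypothetical decision rule

# Usage with synthetic timestamps: a burst of activity from t=10 s to t=15 s.
ts = np.sort(np.concatenate([np.random.uniform(0, 60, 5000),
                             np.random.uniform(10, 15, 20000)]))
for s, e in generate_proposals(ts):
    print(f"[{s:.0f}s, {e:.0f}s] ecstatic display? {classify_proposal(ts, s, e)}")
```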
Related papers
- Fourier-based Action Recognition for Wildlife Behavior Quantification with Event Cameras [9.107129038623242]
We propose approaches to action recognition based on the Fourier Transform.
In particular, we apply our approaches to a recent dataset of breeding penguins annotated for "ecstatic display".
We find that our approaches are both simple and effective, producing results slightly below those of a deep neural network (DNN).
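As a rough illustration of the Fourier-based idea, one can bin events into a count signal, take its FFT, and use spectral-band energy as a feature; the bin size, frequency band, and synthetic input below are assumptions, not the paper's method.

```python
# Hedged sketch of Fourier-based action recognition on event data: bin events
# into a count signal, take its FFT, and use band energy as a simple feature.
# Bin size, band, and the synthetic input are illustrative assumptions.
import numpy as np

def fourier_feature(timestamps, duration, bin_s=0.05, band=(1.0, 5.0)):
    """Fraction of signal energy in a frequency band of the event-rate signal."""
    n_bins = int(duration / bin_s)
    counts, _ = np.histogram(timestamps, bins=n_bins, range=(0, duration))
    spectrum = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
    freqs = np.fft.rfftfreq(n_bins, d=bin_s)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / max(spectrum.sum(), 1e-12)

# A rhythmic behavior (here, a hypothetical ~3 Hz motion) concentrates energy
# in the band; a simple threshold on the feature can then flag the action.
t = np.arange(0, 10, 0.001)
rate = 50 + 40 * np.sin(2 * np.pi * 3.0 * t)      # 3 Hz modulated event rate
ts = t[np.random.rand(len(t)) < rate * 0.001]     # thinned Poisson-like events
print(f"band-energy feature: {fourier_feature(ts, 10.0):.2f}")
```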
arXiv Detail & Related papers (2024-10-09T09:06:37Z)
- EventSleep: Sleep Activity Recognition with Event Cameras [12.584362614255213]
Event cameras are a promising technology for activity recognition in dark environments.
We present EventSleep, a new dataset and methodology to study the suitability of event cameras for a medical application.
arXiv Detail & Related papers (2024-04-02T10:03:23Z)
- MISO: Monitoring Inactivity of Single Older Adults at Home using RGB-D Technology [5.612499701087411]
A new application is proposed for real-time monitoring of inactivity in older adults' own homes.
A lightweight camera monitoring system was developed and piloted in community homes to observe the daily behavior of older adults.
arXiv Detail & Related papers (2023-11-03T21:51:33Z)
- Multimodal Foundation Models for Zero-shot Animal Species Recognition in Camera Trap Images [57.96659470133514]
Motion-activated camera traps constitute an efficient tool for tracking and monitoring wildlife populations across the globe.
Supervised learning techniques have been successfully deployed to analyze such imagery; however, training them requires expert annotations.
Reducing the reliance on costly labelled data has immense potential in developing large-scale wildlife tracking solutions with markedly less human labor.
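One common way to do such zero-shot recognition is CLIP-style image-text matching, sketched below with Hugging Face transformers; the checkpoint, prompts, and placeholder image are illustrative assumptions rather than the paper's exact setup.

```python
# Hedged sketch of zero-shot species recognition with a multimodal foundation
# model (CLIP via Hugging Face transformers); checkpoint and prompts are
# illustrative assumptions, not the paper's setup.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

labels = ["a camera trap photo of a deer",
          "a camera trap photo of a fox",
          "an empty camera trap photo"]
image = Image.new("RGB", (224, 224))              # stand-in for a trap image

# No task-specific labels are used: the model scores the image against
# free-text class descriptions and we take a softmax over the matches.
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=1)
for label, p in zip(labels, probs[0].tolist()):
    print(f"{p:.2f}  {label}")
```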
arXiv Detail & Related papers (2023-11-02T08:32:00Z)
- Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
The survey reviews event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
It categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep-learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z)
- A Temporal Densely Connected Recurrent Network for Event-based Human Pose Estimation [24.367222637492787]
Event cameras are emerging bio-inspired vision sensors that report per-pixel brightness changes asynchronously.
This paper proposes a novel densely connected recurrent architecture to address the problem of incomplete information.
With this recurrent architecture, we can explicitly model not only sequential but also non-sequential geometric consistency across time steps.
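A loose PyTorch sketch of that idea: each step's hidden state is updated both from the previous state (sequential) and from a fused summary of all earlier states (non-sequential dense connections). The GRU cell, averaging-based fusion, and dimensions are assumptions, not the paper's architecture.

```python
# Hedged sketch of a densely connected recurrent stack: each time step sees
# the previous hidden state plus a fused summary of all earlier ones. The
# GRU cell, fusion rule, and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class DenseRecurrent(nn.Module):
    def __init__(self, in_dim=64, hid_dim=128):
        super().__init__()
        self.cell = nn.GRUCell(in_dim, hid_dim)
        self.fuse = nn.Linear(hid_dim, hid_dim)   # mixes the dense skip path

    def forward(self, x):                         # x: (T, B, in_dim)
        history, h = [], x.new_zeros(x.size(1), self.cell.hidden_size)
        for t in range(x.size(0)):
            h = self.cell(x[t], h)
            if history:                           # dense connection to all
                h = h + torch.tanh(self.fuse(torch.stack(history).mean(0)))
            history.append(h)
        return torch.stack(history)               # (T, B, hid_dim)

# Usage: 8 time steps of event-frame features for a batch of 2.
feats = torch.randn(8, 2, 64)
print(DenseRecurrent()(feats).shape)              # torch.Size([8, 2, 128])
```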
arXiv Detail & Related papers (2022-09-15T04:08:18Z)
- Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and highly accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z)
- A Preliminary Research on Space Situational Awareness Based on Event Cameras [8.27218838055049]
Event cameras are a new type of sensor, different from traditional cameras: an event is triggered by a change in the brightness falling on a pixel.
Compared with traditional cameras, event cameras have the advantages of high temporal resolution, low latency, high dynamic range, low bandwidth and low power consumption.
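A minimal sketch of this standard event-generation model: a pixel emits an event whenever its log-brightness has changed by a contrast threshold C since that pixel's last event. The threshold, input frames, and single-crossing simplification below are illustrative assumptions.

```python
# Sketch of the standard event-generation model: a pixel fires an event when
# its log-brightness changes by a contrast threshold C since its last event.
# C and the synthetic input are illustrative; real sensors may emit several
# events per large change (single-crossing simplification here).
import numpy as np

def events_from_frames(frames, times, C=0.2, eps=1e-6):
    """Convert intensity frames (T, H, W) into (t, x, y, polarity) events
    using per-pixel log-intensity crossings of threshold C."""
    log_ref = np.log(frames[0] + eps)             # last log-level per pixel
    events = []
    for k in range(1, len(frames)):
        log_now = np.log(frames[k] + eps)
        diff = log_now - log_ref
        for pol, mask in ((+1, diff >= C), (-1, diff <= -C)):
            ys, xs = np.nonzero(mask)
            events += [(times[k], x, y, pol) for x, y in zip(xs, ys)]
            log_ref[mask] = log_now[mask]         # reset crossed pixels
    return events

# Usage: a bright square moving over a static scene produces events only at
# its moving edges, illustrating the sparse, motion-driven output.
T, H, W = 5, 32, 32
frames = np.full((T, H, W), 10.0)
for k in range(T):
    frames[k, 12:20, 4 + 4 * k : 12 + 4 * k] = 100.0
print(f"{len(events_from_frames(frames, np.arange(T) * 0.01))} events")
```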
arXiv Detail & Related papers (2022-03-24T14:36:18Z)
- Bridging the Gap between Events and Frames through Unsupervised Domain Adaptation [57.22705137545853]
We propose a task transfer method that allows models to be trained directly with labeled images and unlabeled event data.
We leverage the generative event model to split event features into content and motion features.
Our approach unlocks the vast amount of existing image datasets for the training of event-based neural networks.
arXiv Detail & Related papers (2021-09-06T17:31:37Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors that measure per-pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting to brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.