Integrating Temporal Context into Streaming Data for Human Activity Recognition in Smart Home
- URL: http://arxiv.org/abs/2601.11611v1
- Date: Fri, 09 Jan 2026 09:47:06 GMT
- Title: Integrating Temporal Context into Streaming Data for Human Activity Recognition in Smart Home
- Authors: Marina Vicini, Martin Rudorfer, Zhuangzhuang Dai, Luis J. Manso
- Abstract summary: Human Activity Recognition (HAR) from passive sensors mostly relies on traditional machine learning. We tackle this by clustering activities into morning, afternoon, and night. We propose to extend the feature vector by incorporating time of day and day of week as cyclical temporal features.
- Score: 3.1032184155196982
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the global population ageing, it is crucial to enable individuals to live independently and safely in their homes. Using ubiquitous sensors such as Passive InfraRed sensors (PIR) and door sensors is drawing increasing interest for monitoring daily activities and facilitating preventative healthcare interventions for the elderly. Human Activity Recognition (HAR) from passive sensors mostly relies on traditional machine learning and includes data segmentation, feature extraction, and classification. While techniques like Sensor Weighting Mutual Information (SWMI) capture spatial context in a feature vector, effectively leveraging temporal information remains a challenge. We tackle this by clustering activities into morning, afternoon, and night, and encoding these clusters into the feature weighting method by calculating distinct mutual information matrices. We further propose to extend the feature vector by incorporating time of day and day of week as cyclical temporal features, as well as adding a feature to track the user's location. The experiments show improved accuracy and F1-score over existing state-of-the-art methods in three out of four real-world datasets, with the highest gains in a low-data regime. These results highlight the potential of our approach for developing effective smart home solutions to support ageing in place.
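The abstract does not spell out the exact feature construction, but cyclical time-of-day and day-of-week features are conventionally built as sine/cosine pairs so that, e.g., 23:00 and 00:00 land close together on the unit circle. The sketch below assumes that standard construction, plus illustrative boundaries for the morning/afternoon/night clustering; both the function names and the segment cut-offs are assumptions, not taken from the paper.

```python
import math

def cyclical_features(hour: float, weekday: int) -> list[float]:
    """Encode time of day and day of week as (sin, cos) pairs.

    Standard cyclical encoding; the paper's exact feature layout
    is an assumption here.
    """
    hour_angle = 2 * math.pi * hour / 24.0   # hour in [0, 24)
    day_angle = 2 * math.pi * weekday / 7.0  # weekday in {0, ..., 6}
    return [
        math.sin(hour_angle), math.cos(hour_angle),
        math.sin(day_angle), math.cos(day_angle),
    ]

def time_segment(hour: float) -> str:
    """Cluster a timestamp into coarse segments (assumed boundaries),
    so that a distinct mutual information matrix can be computed per segment."""
    if 6 <= hour < 12:
        return "morning"
    if 12 <= hour < 18:
        return "afternoon"
    return "night"
```

Unlike a raw hour value, the sin/cos pair keeps midnight adjacent to 23:00: the encodings of 23:30 and 00:30 differ by a small chord on the unit circle rather than by a jump of 23.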
Related papers
- Online Segment Any 3D Thing as Instance Tracking [60.20416622842975]
We reconceptualize online 3D segmentation as an instance tracking problem (AutoSeg3D). We introduce spatial consistency learning to mitigate the fragmentation problem inherent in Vision Foundation Models. Our method establishes a new state-of-the-art, surpassing ESAM by 2.8 AP on ScanNet200.
arXiv Detail & Related papers (2025-12-08T14:48:51Z) - MARAuder's Map: Motion-Aware Real-time Activity Recognition with Layout-Based Trajectories [3.788163163289351]
We propose a novel framework for real-time activity recognition from raw, unsegmented sensor streams. Our method projects sensor activations onto the physical floorplan to generate trajectory-aware, image-like sequences. To enhance temporal awareness, we introduce a learnable time embedding module that encodes contextual cues.
arXiv Detail & Related papers (2025-11-08T00:07:43Z) - DISCOVER: Data-driven Identification of Sub-activities via Clustering and Visualization for Enhanced Activity Recognition in Smart Homes [46.86909768552777]
We introduce DISCOVER, a method to discover fine-grained human sub-activities from unlabeled sensor data without relying on pre-segmentation. We demonstrate its effectiveness through a re-annotation exercise on widely used HAR datasets.
arXiv Detail & Related papers (2025-02-11T20:02:24Z) - Scaling Wearable Foundation Models [54.93979158708164]
We investigate the scaling properties of sensor foundation models across compute, data, and model size.
Using a dataset of up to 40 million hours of in-situ heart rate, heart rate variability, electrodermal activity, accelerometer, skin temperature, and altimeter per-minute data from over 165,000 people, we create LSM.
Our results establish the scaling laws of LSM for tasks such as imputation, extrapolation, both across time and sensor modalities.
arXiv Detail & Related papers (2024-10-17T15:08:21Z) - Know Thy Neighbors: A Graph Based Approach for Effective Sensor-Based Human Activity Recognition in Smart Homes [0.0]
We propose a novel graph-guided neural network approach for Human Activity Recognition (HAR) in smart homes.
We accomplish this by learning a more expressive graph structure representing the sensor network in a smart home.
Our approach maps discrete input sensor measurements to a feature space through the application of attention mechanisms.
arXiv Detail & Related papers (2023-11-16T02:43:13Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - A Real-time Human Pose Estimation Approach for Optimal Sensor Placement in Sensor-based Human Activity Recognition [63.26015736148707]
This paper introduces a novel methodology to resolve the issue of optimal sensor placement for Human Activity Recognition.
The derived skeleton data provides a unique strategy for identifying the optimal sensor location.
Our findings indicate that the vision-based method for sensor placement offers comparable results to the conventional deep learning approach.
arXiv Detail & Related papers (2023-07-06T10:38:14Z) - A Spatio-Temporal Multilayer Perceptron for Gesture Recognition [70.34489104710366]
We propose a multilayer state-weighted perceptron for gesture recognition in the context of autonomous vehicles.
An evaluation of TCG and Drive&Act datasets is provided to showcase the promising performance of our approach.
We deploy our model to our autonomous vehicle to show its real-time capability and stable execution.
arXiv Detail & Related papers (2022-04-25T08:42:47Z) - Self-Supervised Transformers for Activity Classification using Ambient Sensors [3.1829446824051195]
This paper proposes a methodology to classify the activities of a resident within an ambient sensor based environment.
We also propose a methodology to pre-train Transformers in a self-supervised manner, as a hybrid autoencoder-classifier model.
arXiv Detail & Related papers (2020-11-22T20:46:25Z) - Deep ConvLSTM with self-attention for human activity decoding using wearables [0.0]
We propose a deep neural network architecture that not only captures features from multiple sensor time-series but also selects important time points.
We show the validity of the proposed approach across different data sampling strategies and demonstrate that the self-attention mechanism yields a significant improvement.
The proposed methods open avenues for better decoding of human activity from multiple body sensors over extended periods of time.
arXiv Detail & Related papers (2020-05-02T04:30:31Z) - Human Activity Recognition from Wearable Sensor Data Using Self-Attention [2.9023633922848586]
We present a self-attention based neural network model for activity recognition from body-worn sensor data.
We performed experiments on four popular publicly available HAR datasets: PAMAP2, Opportunity, Skoda and USC-HAD.
Our model achieves significant performance improvements over recent state-of-the-art models in both benchmark test-subject and leave-one-subject-out evaluations.
arXiv Detail & Related papers (2020-03-17T14:16:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.