Sensor Data for Human Activity Recognition: Feature Representation and Benchmarking
- URL: http://arxiv.org/abs/2005.07308v1
- Date: Fri, 15 May 2020 00:46:55 GMT
- Title: Sensor Data for Human Activity Recognition: Feature Representation and Benchmarking
- Authors: Flávia Alves, Martin Gairing, Frans A. Oliehoek and Thanh-Toan Do
- Abstract summary: The field of Human Activity Recognition (HAR) focuses on obtaining and analysing data captured from monitoring devices (e.g. sensors).
We address the issue of accurately recognising human activities using different Machine Learning (ML) techniques.
- Score: 27.061240686613182
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The field of Human Activity Recognition (HAR) focuses on obtaining and
analysing data captured from monitoring devices (e.g. sensors). There is a wide
range of applications within the field; for instance, assisted living, security
surveillance, and intelligent transportation. In HAR, the development of
Activity Recognition models is dependent upon the data captured by these
devices and the methods used to analyse them, which directly affect performance
metrics. In this work, we address the issue of accurately recognising human
activities using different Machine Learning (ML) techniques. We propose a new
feature representation based on consecutive occurring observations and compare
it against previously used feature representations using a wide range of
classification methods. Experimental results demonstrate that techniques based on the proposed representation outperform the baselines, achieving better accuracy for both highly frequent and less frequent actions. We also investigate how the addition of further features and their pre-processing techniques affects performance, leading to state-of-the-art accuracy on a Human Activity Recognition dataset.
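The abstract does not spell out how the representation from consecutive occurring observations is constructed, so the sketch below is only illustrative and not the paper's exact method: a sliding window that concatenates each observation with its k-1 predecessors into one feature vector for a standard classifier. The function name `consecutive_windows`, the window length `k`, and the synthetic data are assumptions made for this example.

```python
# Minimal sketch (NOT the paper's construction): build feature vectors from
# k consecutive sensor observations and train a standard ML classifier on them.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

def consecutive_windows(observations, labels, k=5):
    """Stack each observation with its k-1 predecessors into one feature vector.

    observations: (n_samples, n_features) array of per-timestep sensor readings.
    labels:       (n_samples,) activity label of each timestep.
    Returns X of shape (n_samples - k + 1, k * n_features) and, for each window,
    the label of its last observation.
    """
    X, y = [], []
    for end in range(k, len(observations) + 1):
        X.append(observations[end - k:end].ravel())  # concatenate k consecutive rows
        y.append(labels[end - 1])                    # label the window by its last step
    return np.asarray(X), np.asarray(y)

# Synthetic stand-in for a HAR sensor stream (e.g. binary sensor activations).
rng = np.random.default_rng(0)
raw = rng.integers(0, 2, size=(2000, 14)).astype(float)
acts = rng.integers(0, 6, size=2000)

X, y = consecutive_windows(raw, acts, k=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), zero_division=0))
```

Windowing choices (window length, and whether a window takes the label of its last or its majority observation) strongly affect per-class accuracy, which is the kind of trade-off the abstract's comparison across classification methods is concerned with.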
Related papers
- Consistency Based Weakly Self-Supervised Learning for Human Activity Recognition with Wearables [1.565361244756411]
We describe a weakly self-supervised approach for recognizing human activities from sensor-based data.
We show that our approach can help the clustering algorithm achieve comparable performance in identifying and categorizing the underlying human activities.
arXiv Detail & Related papers (2024-07-29T06:29:21Z)
- What Makes Pre-Trained Visual Representations Successful for Robust Manipulation? [57.92924256181857]
We find that visual representations designed for manipulation and control tasks do not necessarily generalize under subtle changes in lighting and scene texture.
We find that emergent segmentation ability is a strong predictor of out-of-distribution generalization among ViT models.
arXiv Detail & Related papers (2023-11-03T18:09:08Z)
- Unsupervised Embedding Learning for Human Activity Recognition Using Wearable Sensor Data [2.398608007786179]
We present an unsupervised approach to project the human activities into an embedding space in which similar activities will be located closely together.
Results of experiments on three labeled benchmark datasets demonstrate the effectiveness of the framework.
arXiv Detail & Related papers (2023-07-21T08:52:47Z)
- A Real-time Human Pose Estimation Approach for Optimal Sensor Placement in Sensor-based Human Activity Recognition [63.26015736148707]
This paper introduces a novel methodology to resolve the issue of optimal sensor placement for Human Activity Recognition.
The derived skeleton data provides a unique strategy for identifying the optimal sensor location.
Our findings indicate that the vision-based method for sensor placement offers comparable results to the conventional deep learning approach.
arXiv Detail & Related papers (2023-07-06T10:38:14Z)
- Human Activity Recognition Using Self-Supervised Representations of Wearable Data [0.0]
Development of accurate algorithms for human activity recognition (HAR) is hindered by the lack of large real-world labeled datasets.
Here we develop a 6-class HAR model with strong performance when evaluated on real-world datasets not seen during training.
arXiv Detail & Related papers (2023-04-26T07:33:54Z)
- Dataset Bias in Human Activity Recognition [57.91018542715725]
This contribution statistically curates the training data to assess to what degree the physical characteristics of humans influence HAR performance.
We evaluate the performance of a state-of-the-art convolutional neural network on two HAR datasets that vary in the sensors, activities, and recording settings for time-series HAR.
arXiv Detail & Related papers (2023-01-19T12:33:50Z)
- Video-based Pose-Estimation Data as Source for Transfer Learning in Human Activity Recognition [71.91734471596433]
Human Activity Recognition (HAR) using on-body devices identifies specific human actions in unconstrained environments.
Previous works demonstrated that transfer learning is a good strategy for addressing scenarios with scarce data.
This paper proposes using datasets intended for human-pose estimation as a source for transfer learning.
arXiv Detail & Related papers (2022-12-02T18:19:36Z)
- Cluster-level pseudo-labelling for source-free cross-domain facial expression recognition [94.56304526014875]
We propose the first Source-Free Unsupervised Domain Adaptation (SFUDA) method for Facial Expression Recognition (FER).
Our method exploits self-supervised pretraining to learn good feature representations from the target data.
We validate the effectiveness of our method in four adaptation setups, proving that it consistently outperforms existing SFUDA methods when applied to FER.
arXiv Detail & Related papers (2022-10-11T08:24:50Z)
- TASKED: Transformer-based Adversarial learning for human activity recognition using wearable sensors via Self-KnowledgE Distillation [6.458496335718508]
We propose a novel Transformer-based Adversarial learning framework for human activity recognition using wearable sensors via Self-KnowledgE Distillation (TASKED).
In the proposed method, we adopt the teacher-free self-knowledge distillation to improve the stability of the training procedure and the performance of human activity recognition.
arXiv Detail & Related papers (2022-09-14T11:08:48Z)
- Human Activity Recognition using Attribute-Based Neural Networks and Context Information [61.67246055629366]
We consider human activity recognition (HAR) from wearable sensor data in manual-work processes.
We show how context information can be integrated systematically into a deep neural network-based HAR system.
We empirically show that our proposed architecture increases HAR performance, compared to state-of-the-art methods.
arXiv Detail & Related papers (2021-10-28T06:08:25Z)
- Contrastive Predictive Coding for Human Activity Recognition [5.766384728949437]
We introduce the Contrastive Predictive Coding framework to human activity recognition, which captures the long-term temporal structure of sensor data streams.
CPC-based pre-training is self-supervised, and the resulting learned representations can be integrated into standard activity chains.
It leads to significantly improved recognition performance when only small amounts of labeled training data are available.
arXiv Detail & Related papers (2020-12-09T21:44:36Z)
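For context on the Contrastive Predictive Coding entry above: its summary gives no implementation details, so the following is a minimal, hypothetical sketch of CPC-style self-supervised pre-training on unlabeled sensor windows (a convolutional encoder, a GRU context model, per-step linear predictors, and an InfoNCE loss with in-batch negatives). Layer sizes, the prediction horizon, and all names are illustrative assumptions, not the cited paper's configuration.

```python
# Sketch of CPC-style pre-training on sensor windows (PyTorch assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CPCSensorModel(nn.Module):
    def __init__(self, n_channels=6, z_dim=64, c_dim=128, horizon=4):
        super().__init__()
        # Encoder: per-timestep latent z_t from raw sensor channels.
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, z_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(z_dim, z_dim, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Autoregressive context model: c_t summarises z_1..z_t.
        self.gru = nn.GRU(z_dim, c_dim, batch_first=True)
        # One linear predictor per future step k = 1..horizon.
        self.predictors = nn.ModuleList(nn.Linear(c_dim, z_dim) for _ in range(horizon))
        self.horizon = horizon

    def forward(self, x):
        # x: (batch, time, channels) window of sensor readings.
        z = self.encoder(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, z_dim)
        c, _ = self.gru(z)                                    # (batch, time, c_dim)
        return z, c

def info_nce_loss(z, c, model):
    """InfoNCE: each context c_t must identify its true future latent z_{t+k}
    among the other windows in the batch (used as negatives)."""
    batch, time, _ = z.shape
    loss, count = 0.0, 0
    for k, predictor in enumerate(model.predictors, start=1):
        if time - k <= 0:
            continue
        pred = predictor(c[:, :time - k])         # (batch, time-k, z_dim)
        target = z[:, k:]                         # (batch, time-k, z_dim)
        for t in range(time - k):
            logits = pred[:, t] @ target[:, t].T  # (batch, batch) similarity scores
            labels = torch.arange(batch)
            loss = loss + F.cross_entropy(logits, labels)
            count += 1
    return loss / max(count, 1)

# Toy pre-training loop on random data standing in for unlabeled sensor windows.
model = CPCSensorModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(5):
    x = torch.randn(16, 50, 6)                    # (batch, time, channels)
    z, c = model(x)
    loss = info_nce_loss(z, c, model)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After pre-training on unlabeled streams, the encoder and context outputs would typically be frozen or fine-tuned under a small labeled classifier head, which is how such self-supervised representations are usually plugged into an activity recognition pipeline.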