CHARM: A Hierarchical Deep Learning Model for Classification of Complex
Human Activities Using Motion Sensors
- URL: http://arxiv.org/abs/2207.07806v1
- Date: Sat, 16 Jul 2022 01:36:54 GMT
- Title: CHARM: A Hierarchical Deep Learning Model for Classification of Complex
Human Activities Using Motion Sensors
- Authors: Eric Rosen and Doruk Senkal
- Abstract summary: CHARM is a hierarchical deep learning model for classification of complex human activities using motion sensors.
It outperforms state-of-the-art supervised learning approaches for high-level activity recognition in terms of average accuracy and F1 scores.
The ability to learn low-level user activities when trained using only high-level activity labels may pave the way to semi-supervised learning of HAR tasks.
- Score: 0.9594432031144714
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we report a hierarchical deep learning model for
classification of complex human activities using motion sensors. In contrast to
traditional Human Activity Recognition (HAR) models used for event-based
activity recognition, such as step counting, fall detection, and gesture
identification, this new deep learning model, which we refer to as CHARM
(Complex Human Activity Recognition Model), is aimed at the recognition of
high-level human activities that are composed of multiple different low-level
activities in a non-deterministic sequence, such as meal preparation, house
chores, and daily routines. CHARM not only quantitatively outperforms
state-of-the-art supervised learning approaches for high-level activity
recognition in terms of average accuracy and F1 scores, but also automatically
learns to recognize low-level activities, such as manipulation gestures and
locomotion modes, without any explicit labels for such activities. This opens
new avenues for Human-Machine Interaction (HMI) modalities using wearable
sensors, where the user can choose to associate an automated task with a
high-level activity, such as controlling home automation (e.g., robotic vacuum
cleaners, lights, and thermostats) or presenting contextually relevant
information at the right time (e.g., reminders, status updates, and
weather/news reports). In addition, the ability to learn low-level user
activities when trained using only high-level activity labels may pave the way
to semi-supervised learning of HAR tasks that are inherently difficult to
label.
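To make the hierarchical idea above concrete, here is a minimal PyTorch-style sketch of a model trained only with high-level activity labels while exposing latent low-level activity assignments as a by-product. All module names, window lengths, and layer sizes below are illustrative assumptions and do not reproduce the actual CHARM architecture, which the abstract does not detail.

    # Minimal sketch of a hierarchical HAR model in the spirit of CHARM (assumed layout).
    import torch
    import torch.nn as nn

    class HierarchicalHAR(nn.Module):
        """Low-level window encoder + high-level sequence classifier.

        Trained only with high-level activity labels; the low-level head yields
        soft assignments over latent low-level activities without any low-level labels.
        """

        def __init__(self, n_channels=6, n_low=8, n_high=5, hidden=64):
            super().__init__()
            # Encode each short sensor window (e.g., a few seconds of accel + gyro) independently.
            self.window_encoder = nn.Sequential(
                nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
                nn.Flatten(),
                nn.Linear(32, hidden),
                nn.ReLU(),
            )
            # Soft assignment over latent low-level activities (no labels required).
            self.low_head = nn.Linear(hidden, n_low)
            # Aggregate the sequence of low-level assignments into a high-level prediction.
            self.aggregator = nn.GRU(n_low, hidden, batch_first=True)
            self.high_head = nn.Linear(hidden, n_high)

        def forward(self, x):
            # x: (batch, n_windows, n_channels, window_len)
            b, w, c, t = x.shape
            feats = self.window_encoder(x.reshape(b * w, c, t))    # (b*w, hidden)
            low_logits = self.low_head(feats).reshape(b, w, -1)    # (b, w, n_low)
            low_probs = low_logits.softmax(dim=-1)                 # latent low-level activities
            _, h = self.aggregator(low_probs)                      # summarize the window sequence
            return self.high_head(h[-1]), low_probs                # high-level logits + low-level assignments

    # Example: a recording split into 5 windows of 200 samples over 6 sensor channels.
    model = HierarchicalHAR()
    high_logits, low_probs = model(torch.randn(4, 5, 6, 200))
    loss = nn.functional.cross_entropy(high_logits, torch.randint(0, 5, (4,)))  # high-level labels only

Because only the high-level cross-entropy loss drives training, any structure that emerges in the low-level assignments (for example, clusters corresponding to gestures or locomotion modes) is learned without explicit low-level labels, mirroring the behavior described in the abstract.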
Related papers
- Consistency Based Weakly Self-Supervised Learning for Human Activity Recognition with Wearables [1.565361244756411]
We describe a weakly self-supervised approach for recognizing human activities from sensor-based data.
We show that our approach can help the clustering algorithm achieve comparable performance in identifying and categorizing the underlying human activities.
arXiv Detail & Related papers (2024-07-29T06:29:21Z)
- Efficient Adaptive Human-Object Interaction Detection with Concept-guided Memory [64.11870454160614]
We propose an efficient Adaptive HOI Detector with Concept-guided Memory (ADA-CM)
ADA-CM has two operating modes. In the first, the detector is applied directly without learning any new parameters, in a training-free paradigm.
Our proposed method achieves competitive results with state-of-the-art on the HICO-DET and V-COCO datasets with much less training time.
arXiv Detail & Related papers (2023-09-07T13:10:06Z)
- Robust Activity Recognition for Adaptive Worker-Robot Interaction using Transfer Learning [0.0]
This paper proposes a transfer learning methodology for activity recognition of construction workers.
The developed algorithm transfers features from a model pre-trained by the original authors and fine-tunes them for the downstream task of activity recognition.
Results indicate that the fine-tuned model can recognize distinct MMH tasks in a robust and adaptive manner.
arXiv Detail & Related papers (2023-08-28T19:03:46Z)
- A Matter of Annotation: An Empirical Study on In Situ and Self-Recall Activity Annotations from Wearable Sensors [56.554277096170246]
We present an empirical study that evaluates and contrasts four commonly employed annotation methods in user studies focused on in-the-wild data collection.
For both the user-driven, in situ annotations, where participants annotate their activities during the actual recording process, and the recall methods, where participants retrospectively annotate their data at the end of each day, the participants had the flexibility to select their own set of activity classes and corresponding labels.
arXiv Detail & Related papers (2023-05-15T16:02:56Z)
- Classifying Human Activities using Machine Learning and Deep Learning Techniques [0.0]
Human Activity Recognition (HAR) describes a machine's ability to recognize human actions.
The challenge in HAR is to overcome the difficulty of separating human activities based on the given data.
Deep learning techniques such as Long Short-Term Memory (LSTM), Bi-Directional LSTM, Recurrent Neural Network (RNN), and Gated Recurrent Unit (GRU) classifiers are trained.
Experimental results showed that the Linear Support Vector Machine among the machine learning methods and the Gated Recurrent Unit among the deep learning methods provided better accuracy for human activity recognition.
arXiv Detail & Related papers (2022-05-19T05:20:04Z)
- HAR-GCNN: Deep Graph CNNs for Human Activity Recognition From Highly Unlabeled Mobile Sensor Data [61.79595926825511]
Acquiring balanced datasets containing accurate activity labels requires humans to correctly annotate and potentially interfere with the subjects' normal activities in real-time.
We propose HAR-GCNN, a deep graph CNN model that leverages the correlation between chronologically adjacent sensor measurements to predict the correct labels for unclassified activities.
HAR-GCNN shows superior performance relative to previously used baseline methods, improving classification accuracy by about 25% and up to 68% on different datasets.
arXiv Detail & Related papers (2022-03-07T01:23:46Z)
- HAKE: A Knowledge Engine Foundation for Human Activity Understanding [65.24064718649046]
Human activity understanding is of widespread interest in artificial intelligence and spans diverse applications like health care and behavior analysis.
We propose a novel paradigm to reformulate this task in two stages: first mapping pixels to an intermediate space spanned by atomic activity primitives, then programming detected primitives with interpretable logic rules to infer semantics.
Our framework, the Human Activity Knowledge Engine (HAKE), exhibits superior generalization ability and performance upon challenging benchmarks.
arXiv Detail & Related papers (2022-02-14T16:38:31Z)
- Self-supervised Pretraining with Classification Labels for Temporal Activity Detection [54.366236719520565]
Temporal Activity Detection aims to predict activity classes per frame.
Due to the expensive frame-level annotations required for detection, the scale of detection datasets is limited.
This work proposes a novel self-supervised pretraining method for detection leveraging classification labels.
arXiv Detail & Related papers (2021-11-26T18:59:28Z)
- Human Activity Recognition using Attribute-Based Neural Networks and Context Information [61.67246055629366]
We consider human activity recognition (HAR) from wearable sensor data in manual-work processes.
We show how context information can be integrated systematically into a deep neural network-based HAR system.
We empirically show that our proposed architecture increases HAR performance, compared to state-of-the-art methods.
arXiv Detail & Related papers (2021-10-28T06:08:25Z)
- Contrastive Predictive Coding for Human Activity Recognition [5.766384728949437]
We introduce the Contrastive Predictive Coding framework to human activity recognition, which captures the long-term temporal structure of sensor data streams.
CPC-based pre-training is self-supervised, and the resulting learned representations can be integrated into standard activity chains.
It leads to significantly improved recognition performance when only small amounts of labeled training data are available.
arXiv Detail & Related papers (2020-12-09T21:44:36Z)
- HHAR-net: Hierarchical Human Activity Recognition using Neural Networks [2.4530909757679633]
This research aims to build a hierarchical classification model with neural networks to recognize human activities.
We evaluate our model on the Extrasensory dataset, a dataset collected in the wild that contains data from smartphones and smartwatches.
arXiv Detail & Related papers (2020-10-28T17:06:42Z)