Domain Adaptation Under Behavioral and Temporal Shifts for Natural Time
Series Mobile Activity Recognition
- URL: http://arxiv.org/abs/2207.04367v1
- Date: Sun, 10 Jul 2022 02:48:34 GMT
- Title: Domain Adaptation Under Behavioral and Temporal Shifts for Natural Time
Series Mobile Activity Recognition
- Authors: Garrett Wilson, Janardhan Rao Doppa, Diane J. Cook
- Abstract summary: Existing datasets typically consist of scripted movements.
Our long-term goal is to perform mobile activity recognition in natural settings.
Because of the large variations present in human behavior, we collect data from many participants across two different age groups.
- Score: 31.43183992755392
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Increasingly, human behavior is captured on mobile devices, leading to
growing interest in automated human activity recognition. However, existing
datasets typically consist of scripted movements. Our long-term goal is to
perform mobile activity recognition in natural settings. We collect a dataset
to support this goal with activity categories that are relevant for downstream
tasks such as health monitoring and intervention. Because of the large
variations present in human behavior, we collect data from many participants
across two different age groups. Because human behavior can change over time,
we also collect data from participants over a month's time to capture the
temporal drift. We hypothesize that mobile activity recognition can benefit
from unsupervised domain adaptation algorithms. To address this need and test
this hypothesis, we analyze the performance of domain adaptation across people
and across time. We then enhance unsupervised domain adaptation with
contrastive learning and with weak supervision when label proportions are
available. The dataset is available at
https://github.com/WSU-CASAS/smartwatch-data
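As a concrete illustration of the adaptation setup described above, the following is a minimal PyTorch sketch of one common way to combine a domain-adversarial unsupervised domain adaptation objective (labeled source participant, unlabeled target participant) with a contrastive term on augmented target windows. The module names (encoder, classifier, domain_critic), the augmented view x_tgt_aug, and the weights lam and beta are illustrative assumptions, not the paper's released implementation.

```python
# Illustrative sketch only: unsupervised domain adaptation across participants
# with an added contrastive term. Names and hyperparameters are placeholders.
import torch
import torch.nn.functional as F

class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in the backward
    pass so the encoder learns to fool the domain critic."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None

def nt_xent(z, temperature=0.5):
    """NT-Xent contrastive loss over a batch of two views stacked as [2N, d]."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature
    n = z.size(0) // 2
    diag = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(diag, float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def train_step(encoder, classifier, domain_critic, opt,
               x_src, y_src, x_tgt, x_tgt_aug, lam=1.0, beta=0.1):
    """One step: supervised loss on the labeled source participant,
    domain-adversarial loss on source vs. target features,
    contrastive loss on two views of the unlabeled target windows."""
    f_src, f_tgt, f_tgt_aug = encoder(x_src), encoder(x_tgt), encoder(x_tgt_aug)

    # Supervised classification on the source domain only.
    cls_loss = F.cross_entropy(classifier(f_src), y_src)

    # Domain-adversarial term: critic predicts source (0) vs. target (1).
    feats = torch.cat([f_src, f_tgt])
    dom_labels = torch.cat([torch.zeros(len(f_src)), torch.ones(len(f_tgt))]).long()
    dom_logits = domain_critic(GradientReversal.apply(feats, lam))
    dom_loss = F.cross_entropy(dom_logits, dom_labels.to(dom_logits.device))

    # Contrastive term on two augmented views of the target windows.
    con_loss = nt_xent(torch.cat([f_tgt, f_tgt_aug]))

    loss = cls_loss + dom_loss + beta * con_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

The gradient-reversal layer pushes the encoder toward participant-invariant features, while the contrastive term encourages consistent representations across augmented views of the same target window.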
Related papers
- Reconstructing human activities via coupling mobile phone data with
location-based social networks [20.303827107229445]
We propose a data analysis framework to identify a user's activity via coupling the mobile phone data with location-based social networks (LBSN) data.
We reconstruct the activity chains of 1,000,000 active mobile phone users and analyze the temporal and spatial characteristics of each activity type.
arXiv Detail & Related papers (2023-06-06T06:37:14Z) - Dataset Bias in Human Activity Recognition [57.91018542715725]
This contribution statistically curates the training data to assess to what degree the physical characteristics of humans influence HAR performance.
We evaluate the performance of a state-of-the-art convolutional neural network on two HAR datasets that vary in the sensors, activities, and recording conditions for time-series HAR.
arXiv Detail & Related papers (2023-01-19T12:33:50Z) - Video-based Pose-Estimation Data as Source for Transfer Learning in
Human Activity Recognition [71.91734471596433]
Human Activity Recognition (HAR) using on-body devices identifies specific human actions in unconstrained environments.
Previous works demonstrated that transfer learning is a good strategy for addressing scenarios with scarce data.
This paper proposes using datasets intended for human-pose estimation as a source for transfer learning.
arXiv Detail & Related papers (2022-12-02T18:19:36Z) - Video Action Detection: Analysing Limitations and Challenges [70.01260415234127]
We analyze existing datasets on video action detection and discuss their limitations.
We perform a bias study which analyzes a key property differentiating videos from static images: the temporal aspect.
Such extreme experiments show the existence of biases which have managed to creep into existing methods despite careful modeling.
arXiv Detail & Related papers (2022-04-17T00:42:14Z) - HAR-GCNN: Deep Graph CNNs for Human Activity Recognition From Highly
Unlabeled Mobile Sensor Data [61.79595926825511]
Acquiring balanced datasets containing accurate activity labels requires humans to correctly annotate activities in real time, potentially interfering with the subjects' normal activities.
We propose HAR-GCCN, a deep graph CNN model that leverages the correlation between chronologically adjacent sensor measurements to predict the correct labels for unclassified activities.
HAR-GCCN shows superior performance relative to previously used baseline methods, improving classification accuracy by about 25% and up to 68% on different datasets.
arXiv Detail & Related papers (2022-03-07T01:23:46Z) - Multi-level Motion Attention for Human Motion Prediction [132.29963836262394]
We study the use of different types of attention, computed at joint, body part, and full pose levels.
Our experiments on Human3.6M, AMASS and 3DPW validate the benefits of our approach for both periodical and non-periodical actions.
arXiv Detail & Related papers (2021-06-17T08:08:11Z) - Diverse Complexity Measures for Dataset Curation in Self-driving [80.55417232642124]
We propose a new data selection method that exploits a diverse set of criteria that quantify the interestingness of traffic scenes.
Our experiments show that the proposed curation pipeline is able to select datasets that lead to better generalization and higher performance.
arXiv Detail & Related papers (2021-01-16T23:45:02Z) - Self-Supervised Transformers for Activity Classification using Ambient
Sensors [3.1829446824051195]
This paper proposes a methodology to classify the activities of a resident within an ambient sensor-based environment.
We also propose a methodology to pre-train Transformers in a self-supervised manner, as a hybrid autoencoder-classifier model.
arXiv Detail & Related papers (2020-11-22T20:46:25Z) - An Intelligent Non-Invasive Real Time Human Activity Recognition System
for Next-Generation Healthcare [9.793913891417912]
Human motion can be used to provide remote healthcare solutions for vulnerable people.
At present, wearable devices can provide real-time monitoring by deploying equipment on a person's body.
This paper demonstrates how human motion can be detected in a quasi-real-time scenario using a non-invasive method.
arXiv Detail & Related papers (2020-08-06T10:51:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.