UBIWEAR: An end-to-end, data-driven framework for intelligent physical
activity prediction to empower mHealth interventions
- URL: http://arxiv.org/abs/2212.14731v2
- Date: Tue, 3 Jan 2023 15:43:24 GMT
- Title: UBIWEAR: An end-to-end, data-driven framework for intelligent physical
activity prediction to empower mHealth interventions
- Authors: Asterios Bampakis, Sofia Yfantidou, Athena Vakali
- Abstract summary: UBIWEAR is an end-to-end framework for intelligent physical activity prediction.
Our best model achieves an MAE of 1087 steps, 65% lower than the state of the art in terms of absolute error.
- Score: 3.4483987421251516
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: It is indisputable that physical activity is vital for an individual's health
and wellness. However, a global prevalence of physical inactivity has induced
significant personal and socioeconomic implications. In recent years, a
significant amount of work has showcased the capabilities of self-tracking
technology to create positive health behavior change. This work is motivated by
the potential of personalized and adaptive goal-setting techniques in
encouraging physical activity via self-tracking. To this end, we propose
UBIWEAR, an end-to-end framework for intelligent physical activity prediction,
with the ultimate goal of empowering data-driven goal-setting interventions. To
achieve this, we experiment with numerous machine learning and deep learning
paradigms as a robust benchmark for physical activity prediction tasks. To
train our models, we utilize "MyHeart Counts", an open, large-scale dataset
collected in-the-wild from thousands of users. We also propose a prescriptive
framework for self-tracking aggregated data preprocessing, to facilitate data
wrangling of real-world, noisy data. Our best model achieves an MAE of 1087
steps, 65% lower than the state of the art in terms of absolute error, proving
the feasibility of the physical activity prediction task, and paving the way
for future research.
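To make the task formulation concrete: the paper frames physical activity prediction as supervised regression over aggregated self-tracking data, evaluated by mean absolute error (MAE) in steps. The sketch below is illustrative only, it uses synthetic data and a generic scikit-learn regressor, not the UBIWEAR models, features, or the "MyHeart Counts" dataset; the 7-day window and all parameter values are assumptions for demonstration.

```python
# Hypothetical sketch: next-day step-count prediction as windowed regression,
# evaluated with MAE (the paper's metric). Data, model, and window size are
# illustrative assumptions, not the UBIWEAR pipeline.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic users: each sample is a 7-day window of past daily step counts;
# the target is the next day's step count.
n_samples, window = 2000, 7
base = rng.normal(8000, 2500, size=(n_samples, 1))          # per-user activity level
X = np.clip(base + rng.normal(0, 1500, size=(n_samples, window)), 0, None)
y = np.clip(base[:, 0] + rng.normal(0, 1200, size=n_samples), 0, None)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)
mae = mean_absolute_error(y_test, model.predict(X_test))
print(f"Test MAE: {mae:.0f} steps")
```

Any regressor that minimizes absolute or squared error fits this framing; the paper's contribution is benchmarking many such ML and DL paradigms plus a prescriptive preprocessing scheme for noisy in-the-wild data.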
Related papers
- Physical formula enhanced multi-task learning for pharmacokinetics prediction [54.13787789006417]
A major challenge for AI-driven drug discovery is the scarcity of high-quality data.
We develop a physical formula enhanced multi-task learning (PEMAL) method that predicts four key parameters of pharmacokinetics simultaneously.
Our experiments reveal that PEMAL significantly lowers the data demand, compared to typical Graph Neural Networks.
arXiv Detail & Related papers (2024-04-16T07:42:55Z) - Integrating Wearable Sensor Data and Self-reported Diaries for Personalized Affect Forecasting [2.36325543943271]
We propose a multimodal deep learning model for affect status forecasting.
This model combines a transformer encoder with a pre-trained language model, facilitating the integrated analysis of objective metrics and self-reported diaries.
Our results demonstrate that the proposed model achieves predictive accuracy of 82.50% for positive affect and 82.76% for negative affect, a full week in advance.
arXiv Detail & Related papers (2024-03-16T17:24:38Z) - Exploring Model Transferability through the Lens of Potential Energy [78.60851825944212]
Transfer learning has become crucial in computer vision tasks due to the vast availability of pre-trained deep learning models.
Existing methods for measuring the transferability of pre-trained models rely on statistical correlations between encoded static features and task labels.
We present an insightful physics-inspired approach named PED to address these challenges.
arXiv Detail & Related papers (2023-08-29T07:15:57Z) - Enhancing Activity Prediction Models in Drug Discovery with the Ability
to Understand Human Language [5.117101148161245]
We envision a novel type of activity prediction model that is able to adapt to new prediction tasks at inference time.
Our method CLAMP yields improved predictive performance on few-shot learning benchmarks and zero-shot problems in drug discovery.
arXiv Detail & Related papers (2023-03-06T18:49:09Z) - Dataset Bias in Human Activity Recognition [57.91018542715725]
This contribution statistically curates the training data to assess to what degree the physical characteristics of humans influence HAR performance.
We evaluate the performance of a state-of-the-art convolutional neural network on two time-series HAR datasets that vary in sensors, activities, and recording conditions.
arXiv Detail & Related papers (2023-01-19T12:33:50Z) - A Neural Active Inference Model of Perceptual-Motor Learning [62.39667564455059]
The active inference framework (AIF) is a promising new computational framework grounded in contemporary neuroscience.
In this study, we test the ability of the AIF to capture the role of anticipation in the visual guidance of action in humans.
We present a novel formulation of the prior function that maps a multi-dimensional world-state to a uni-dimensional distribution of free-energy.
arXiv Detail & Related papers (2022-11-16T20:00:38Z) - Learn to Predict How Humans Manipulate Large-sized Objects from
Interactive Motions [82.90906153293585]
We propose a graph neural network, HO-GCN, to fuse motion data and dynamic descriptors for the prediction task.
We show the proposed network that consumes dynamic descriptors can achieve state-of-the-art prediction results and help the network better generalize to unseen objects.
arXiv Detail & Related papers (2022-06-25T09:55:39Z) - Objective Prediction of Tomorrow's Affect Using Multi-Modal
Physiological Data and Personal Chronicles: A Study of Monitoring College
Student Well-being in 2020 [0.0]
The goal of our study was to investigate the capacity to more accurately predict affect through a fully automatic and objective approach using multiple commercial devices.
Longitudinal physiological data and daily assessments of emotions were collected from a sample of college students using smart wearables and phones for over a year.
Results showed that our model was able to predict next-day affect with accuracy comparable to state-of-the-art methods.
arXiv Detail & Related papers (2022-01-26T23:06:20Z) - Physion: Evaluating Physical Prediction from Vision in Humans and
Machines [46.19008633309041]
We present a visual and physical prediction benchmark that precisely measures this capability.
We compare an array of algorithms on their ability to make diverse physical predictions.
We find that graph neural networks with access to the physical state best capture human behavior.
arXiv Detail & Related papers (2021-06-15T16:13:39Z) - Self-supervised transfer learning of physiological representations from
free-living wearable data [12.863826659440026]
We present a novel self-supervised representation learning method using activity and heart rate (HR) signals without semantic labels.
We evaluate our model on the largest free-living combined-sensing dataset (comprising >280k hours of wrist accelerometer & wearable ECG data).
arXiv Detail & Related papers (2020-11-18T23:21:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.