An AI-driven multimodal smart home platform for continuous monitoring and intelligent assistance in post-stroke patients
- URL: http://arxiv.org/abs/2411.19000v3
- Date: Tue, 15 Apr 2025 14:35:16 GMT
- Title: An AI-driven multimodal smart home platform for continuous monitoring and intelligent assistance in post-stroke patients
- Authors: Chenyu Tang, Ruizhi Zhang, Shuo Gao, Zihe Zhao, Zibo Zhang, Jiaqi Wang, Cong Li, Junliang Chen, Yanning Dai, Shengbo Wang, Ruoyu Juan, Qiaoying Li, Ruimou Xie, Xuhang Chen, Xinkai Zhou, Yunjia Xia, Jianan Chen, Fanghao Lu, Xin Li, Ninglli Wang, Peter Smielewski, Yu Pan, Hubin Zhao, Luigi G. Occhipinti
- Abstract summary: We present a smart home platform designed for continuous, at-home rehabilitation of post-stroke patients. A plantar pressure insole classifies users into motor recovery stages with up to 94% accuracy, enabling quantitative tracking of walking patterns. A head-mounted eye-tracking module supports cognitive assessments and hands-free control of household devices. An embedded large language model (LLM) agent, Auto-Care, continuously interprets multimodal data to provide real-time interventions.
- Score: 26.390563099911912
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: At-home rehabilitation for post-stroke patients presents significant challenges, as continuous, personalized care is often limited outside clinical settings. Additionally, the absence of comprehensive solutions addressing diverse monitoring and assistance needs in home environments complicates recovery efforts. Here, we present a multimodal smart home platform designed for continuous, at-home rehabilitation of post-stroke patients, integrating wearable sensing, ambient monitoring, and adaptive automation. A plantar pressure insole equipped with a machine learning pipeline classifies users into motor recovery stages with up to 94% accuracy, enabling quantitative tracking of walking patterns. A head-mounted eye-tracking module supports cognitive assessments and hands-free control of household devices, while ambient sensors ensure sub-second response times for interaction. These data streams are fused locally via a hierarchical Internet of Things (IoT) architecture, protecting privacy and minimizing latency. An embedded large language model (LLM) agent, Auto-Care, continuously interprets multimodal data to provide real-time interventions: issuing personalized reminders, adjusting environmental conditions, and notifying caregivers. Implemented in a post-stroke context, this integrated smart home platform increases overall user satisfaction by an average of 115% (p<0.01) compared to a traditional home environment. Beyond stroke, the system offers a scalable framework for patient-centered, long-term care in broader neurorehabilitation and aging-in-place applications.
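The abstract describes locally fused sensor streams feeding an embedded LLM agent (Auto-Care) that issues reminders, environmental adjustments, and caregiver alerts. The minimal sketch below illustrates that loop; the sensor fields, thresholds, and the query_local_llm helper are hypothetical placeholders, not the authors' implementation.

```python
# Hypothetical sketch of a locally hosted LLM agent loop over fused sensor data.
# All sensor values, thresholds, and the query_local_llm() helper are placeholders.
import json

def fuse_readings(insole, eye_tracker, ambient):
    """Merge the latest multimodal readings into one compact state dict."""
    return {
        "recovery_stage": insole["stage"],          # e.g. a stage on a 1-6 motor scale
        "gait_asymmetry": insole["asymmetry"],      # 0..1
        "gaze_task_score": eye_tracker["score"],    # cognitive task score
        "room_temp_c": ambient["temp_c"],
        "last_activity_s": ambient["seconds_since_motion"],
    }

def query_local_llm(prompt: str) -> dict:
    """Placeholder for an on-device LLM call; returns a structured action."""
    # A real system would run a local model here and parse its JSON output.
    return {"action": "remind", "message": "Time for your afternoon gait exercise."}

def agent_step(insole, eye_tracker, ambient):
    state = fuse_readings(insole, eye_tracker, ambient)
    prompt = (
        "You are a home-rehabilitation assistant. Given this patient state, "
        "choose one action (remind, adjust_environment, notify_caregiver, none):\n"
        + json.dumps(state)
    )
    return query_local_llm(prompt)

if __name__ == "__main__":
    decision = agent_step(
        insole={"stage": 4, "asymmetry": 0.22},
        eye_tracker={"score": 0.87},
        ambient={"temp_c": 27.5, "seconds_since_motion": 5400},
    )
    print(decision)
```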
Related papers
- MELON: Multimodal Mixture-of-Experts with Spectral-Temporal Fusion for Long-Term Mobility Estimation in Critical Care [1.5237145555729716]
We introduce MELON, a novel framework designed to predict 12-hour mobility status in the critical care setting.
We trained and evaluated the MELON model on the multimodal dataset of 126 patients recruited from nine Intensive Care Units at the University of Florida Health Shands Hospital main campus in Gainesville, Florida.
Results showed that MELON outperforms conventional approaches for 12-hour mobility status estimation.
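MELON is summarized as a multimodal mixture-of-experts with spectral-temporal fusion; the toy sketch below shows one way spectral (FFT) and temporal (summary-statistic) views of a signal window can be gated into a single fused embedding. The projection sizes and gating scheme are illustrative assumptions, not MELON's architecture.

```python
# Toy sketch of spectral-temporal fusion with a learned gate; dimensions are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def spectral_features(x):
    """Magnitude spectrum of a 1-D signal window (spectral 'expert' input)."""
    return np.abs(np.fft.rfft(x))

def temporal_features(x):
    """Simple summary statistics of the raw window (temporal 'expert' input)."""
    return np.array([x.mean(), x.std(), x.min(), x.max()])

def gated_fusion(x, w_gate):
    """Softmax gate mixing two expert embeddings into one fused vector."""
    s = spectral_features(x)
    t = temporal_features(x)
    # Project both views to a common size before mixing (random projections here).
    proj_s = s @ rng.normal(size=(s.size, 8))
    proj_t = t @ rng.normal(size=(t.size, 8))
    logits = np.array([proj_s @ w_gate, proj_t @ w_gate])
    gate = np.exp(logits - logits.max())
    gate /= gate.sum()
    return gate[0] * proj_s + gate[1] * proj_t

window = rng.normal(size=128)          # one sensor window, for example
fused = gated_fusion(window, w_gate=rng.normal(size=8))
print(fused.shape)                      # (8,)
```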
arXiv Detail & Related papers (2025-03-10T19:47:46Z) - Continuous Patient Monitoring with AI: Real-Time Analysis of Video in Hospital Care Settings [0.0]
This study introduces an AI-driven platform for continuous and passive patient monitoring in hospital settings.
The platform provides real-time insights into patient behavior and interactions through video analysis.
The dataset, compiled in collaboration with 11 hospital partners, encompasses over 300 high-risk fall patients.
arXiv Detail & Related papers (2024-12-17T18:23:33Z) - OmniBuds: A Sensory Earable Platform for Advanced Bio-Sensing and On-Device Machine Learning [46.3331254985615]
Sensory earables have evolved from basic audio enhancement devices into sophisticated platforms for clinical-grade health monitoring and wellbeing management.
This paper introduces OmniBuds, an advanced sensory earable platform integrating multiple biosensors and onboard computation powered by a machine learning accelerator.
arXiv Detail & Related papers (2024-10-07T06:30:59Z) - Socially Interactive Agents for Robotic Neurorehabilitation Training: Conceptualization and Proof-of-concept Study [7.365940126473552]
We introduce an AI-based system aimed at delivering personalized, out-of-hospital assistance during neurorehabilitation training.
With the assistance of a professional, the envisioned system is designed to accommodate the unique rehabilitation requirements of an individual patient.
Our approach involves the integration of an interactive socially-aware virtual agent into a neurorehabilitation robotic framework.
arXiv Detail & Related papers (2024-06-17T19:07:05Z) - Towards Privacy-Aware and Personalised Assistive Robots: A User-Centred Approach [55.5769013369398]
This research pioneers user-centric, privacy-aware technologies such as Federated Learning (FL).
FL enables collaborative learning without sharing sensitive data, addressing privacy and scalability issues.
This work includes developing solutions for smart wheelchair assistance, enhancing user independence and well-being.
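Federated Learning underpins the privacy claim here: clients train locally and only model parameters are aggregated. A bare-bones federated-averaging sketch is below; the linear model, synthetic client data, and round counts are illustrative assumptions.

```python
# Bare-bones federated averaging over a linear model; clients and data are synthetic.
import numpy as np

rng = np.random.default_rng(3)

def local_update(weights, X, y, lr=0.1, steps=20):
    """A few local gradient steps on one client's private data (least squares)."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Three clients, each keeping its own data; only weights leave the device.
true_w = np.array([1.0, -2.0, 0.5, 3.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 4))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

global_w = np.zeros(4)
for _round in range(10):
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_ws, axis=0)    # server aggregates without seeing raw data

print(np.round(global_w, 2))                # should approach true_w
```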
arXiv Detail & Related papers (2024-05-23T13:14:08Z) - A Telerehabilitation System for the Selection, Evaluation and Remote Management of Therapies [0.044998333629984864]
The main contribution of this paper is to present, as a whole, all the features supported by the innovative Kinect-based Telerehabilitation System (KiReS).
The knowledge extraction functionality handles knowledge about the physical therapy record of patients and treatment protocols.
The teleimmersion functionality provides a convenient, effective and user-friendly experience when performing the telerehabilitation.
arXiv Detail & Related papers (2024-01-16T08:35:36Z) - Clairvoyance: A Pipeline Toolkit for Medical Time Series [95.22483029602921]
Time-series learning is the bread and butter of data-driven *clinical decision support*.
Clairvoyance proposes a unified, end-to-end, autoML-friendly pipeline that serves as a software toolkit.
Clairvoyance is the first to demonstrate viability of a comprehensive and automatable pipeline for clinical time-series ML.
arXiv Detail & Related papers (2023-10-28T12:08:03Z) - A Health Monitoring System Based on Flexible Triboelectric Sensors for Intelligence Medical Internet of Things and its Applications in Virtual Reality [4.522609963399036]
The Internet of Medical Things (IoMT) is a platform that combines Internet of Things (IoT) technology with medical applications.
In this study, we designed a robust and intelligent IoMT system through the synergistic integration of flexible wearable triboelectric sensors and deep learning-assisted data analytics.
We embedded four triboelectric sensors into a wristband to detect and analyze limb movements in patients suffering from Parkinson's Disease (PD).
This innovative approach enabled us to accurately capture and scrutinize the subtle movements and fine motor function of PD patients, thus providing insightful feedback and a comprehensive assessment of the patients' conditions.
arXiv Detail & Related papers (2023-09-13T01:01:16Z) - Design, Development, and Evaluation of an Interactive Personalized Social Robot to Monitor and Coach Post-Stroke Rehabilitation Exercises [68.37238218842089]
We develop an interactive social robot exercise coaching system for personalized rehabilitation.
This system integrates a neural network model with a rule-based model to automatically monitor and assess patients' rehabilitation exercises.
Our system can adapt to new participants and achieved an average performance of 0.81 in assessing their exercises, which is comparable to the experts' agreement level.
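The entry combines a neural network with a rule-based model to assess exercises; the sketch below shows one simple way such scores might be blended. The joint-angle rule, thresholds, and 0.7/0.3 weighting are invented for illustration and are not the paper's values.

```python
# Illustrative hybrid assessment: a learned score blended with hand-written rules.
# The rule thresholds and the 0.7/0.3 weighting are assumptions, not the paper's values.

def rule_based_score(elbow_angle_deg: float, reps_completed: int, reps_target: int) -> float:
    """Score in [0, 1] from simple kinematic and completion rules."""
    angle_ok = 1.0 if 60.0 <= elbow_angle_deg <= 120.0 else 0.0
    completion = min(reps_completed / reps_target, 1.0)
    return 0.5 * angle_ok + 0.5 * completion

def hybrid_score(nn_score: float, elbow_angle_deg: float,
                 reps_completed: int, reps_target: int) -> float:
    """Blend a neural-network quality score with the rule-based score."""
    rules = rule_based_score(elbow_angle_deg, reps_completed, reps_target)
    return 0.7 * nn_score + 0.3 * rules

print(hybrid_score(nn_score=0.82, elbow_angle_deg=95.0, reps_completed=8, reps_target=10))
```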
arXiv Detail & Related papers (2023-05-12T17:37:04Z) - AI-Enhanced Intensive Care Unit: Revolutionizing Patient Care with Pervasive Sensing [2.7503982558916906]
The intensive care unit (ICU) is a specialized hospital space where critically ill patients receive intensive care and monitoring.
Comprehensive monitoring is imperative in assessing patients' conditions, in particular acuity, and ultimately the quality of care.
Currently, visual assessments for acuity, including fine details such as facial expressions, posture, and mobility, are sporadically captured, or not captured at all.
arXiv Detail & Related papers (2023-03-11T00:25:55Z) - Remote patient monitoring using artificial intelligence: Current state, applications, and challenges [13.516357215412024]
This study aims to do a comprehensive review of RPM systems including adopted advanced technologies, AI impact on RPM, challenges and trends in AI-enabled RPM.
The role of AI in RPM ranges from physical activity classification to chronic disease monitoring and vital signs monitoring in emergency settings.
This review results show that AI-enabled RPM architectures have transformed healthcare monitoring applications.
arXiv Detail & Related papers (2023-01-19T06:22:14Z) - PulseImpute: A Novel Benchmark Task for Pulsative Physiological Signal Imputation [54.839600943189915]
Mobile Health (mHealth) is the ability to use wearable sensors to monitor participant physiology at high frequencies during daily life to enable temporally-precise health interventions.
Despite a rich imputation literature, existing techniques are ineffective for the pulsative signals which comprise many mHealth applications.
We address this gap with PulseImpute, the first large-scale pulsative signal imputation challenge which includes realistic mHealth missingness models, an extensive set of baselines, and clinically-relevant downstream tasks.
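PulseImpute frames imputation of pulsative signals under realistic missingness; the sketch below mimics block-wise sensor dropouts on a synthetic pulse-like signal and scores a naive linear-interpolation baseline. The signal, gap lengths, and metric are illustrative assumptions, not the benchmark's actual missingness models.

```python
# Illustrative pulsative-signal masking and a linear-interpolation baseline.
# The synthetic signal and gap lengths are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 10, 0.01)                       # 10 s at 100 Hz
signal = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 2.4 * t)

def apply_block_missingness(x, n_gaps=5, gap_len=80):
    """Zero out contiguous blocks to mimic sensor dropouts; return data and mask."""
    y = x.copy()
    mask = np.ones_like(x, dtype=bool)
    for _ in range(n_gaps):
        start = rng.integers(0, x.size - gap_len)
        mask[start:start + gap_len] = False
    y[~mask] = np.nan
    return y, mask

def linear_impute(y):
    """Fill NaN gaps by linear interpolation over observed samples."""
    idx = np.arange(y.size)
    observed = ~np.isnan(y)
    return np.interp(idx, idx[observed], y[observed])

corrupted, mask = apply_block_missingness(signal)
reconstructed = linear_impute(corrupted)
rmse = np.sqrt(np.mean((reconstructed[~mask] - signal[~mask]) ** 2))
print(f"gap RMSE of the naive baseline: {rmse:.3f}")
```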
arXiv Detail & Related papers (2022-12-14T21:39:15Z) - Reducing a complex two-sided smartwatch examination for Parkinson's Disease to an efficient one-sided examination preserving machine learning accuracy [63.20765930558542]
We have recorded participants performing technology-based assessments in a prospective study to research Parkinson's Disease (PD).
This study provided the largest PD sample size of two-hand synchronous smartwatch measurements.
arXiv Detail & Related papers (2022-05-11T09:12:59Z) - Adherence Forecasting for Guided Internet-Delivered Cognitive Behavioral Therapy: A Minimally Data-Sensitive Approach [59.535699822923]
Internet-delivered psychological treatments (IDPT) are seen as an effective and scalable pathway to improving the accessibility of mental healthcare.
This work proposes a deep-learning approach to perform automatic adherence forecasting, while relying on minimally sensitive login/logout data.
The proposed Self-Attention Network achieved over 70% average balanced accuracy when only one third of the treatment duration had elapsed.
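Adherence forecasting here relies only on login/logout events; the sketch below derives a few session-level features from such timestamps and scores them with a hand-weighted logistic function. The features and weights are illustrative stand-ins for the paper's Self-Attention Network.

```python
# Illustrative adherence features from login/logout timestamps only.
# The hand-set weights stand in for a trained model (the paper uses a Self-Attention Network).
import numpy as np

def session_features(logins_days, logouts_days, horizon_days):
    """Summarize session behaviour observed up to `horizon_days` into a small feature vector."""
    durations = np.array(logouts_days) - np.array(logins_days)
    gaps = np.diff(logins_days) if len(logins_days) > 1 else np.array([horizon_days])
    return np.array([
        len(logins_days) / horizon_days,   # sessions per day
        durations.mean(),                   # mean session length (days)
        gaps.max(),                         # longest gap between sessions (days)
    ])

def adherence_probability(features):
    """Logistic score with illustrative weights; higher means more likely to adhere."""
    weights = np.array([2.0, 1.5, -0.8])
    bias = -0.5
    z = features @ weights + bias
    return float(1.0 / (1.0 + np.exp(-z)))

feats = session_features(logins_days=[0.1, 1.2, 2.9, 4.0],
                         logouts_days=[0.15, 1.3, 3.0, 4.1],
                         horizon_days=5.0)
print(f"predicted adherence probability: {adherence_probability(feats):.2f}")
```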
arXiv Detail & Related papers (2022-01-11T13:55:57Z) - Optimal discharge of patients from intensive care via a data-driven policy learning framework [58.720142291102135]
It is important that the patient discharge task addresses the nuanced trade-off between decreasing a patient's length of stay and the risk of readmission or even death following the discharge decision.
This work introduces an end-to-end general framework for capturing this trade-off to recommend optimal discharge timing decisions.
A data-driven approach is used to derive a parsimonious, discrete state space representation that captures a patient's physiological condition.
arXiv Detail & Related papers (2021-12-17T04:39:33Z) - Certainty Modeling of a Decision Support System for Mobile Monitoring of Exercise-induced Respiratory Conditions [0.0]
The aim is to develop a mobile tool to assist patients in managing their conditions.
We present the proposed model architecture and then describe an application scenario in a clinical setting.
arXiv Detail & Related papers (2021-10-15T07:26:36Z) - Personalized Rehabilitation Robotics based on Online Learning Control [62.6606062732021]
We propose a novel online learning control architecture, which is able to personalize the control force at run time to each individual user.
We evaluate our method in an experimental user study, where the learning controller is shown to provide personalized control, while also obtaining safe interaction forces.
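Online learning control personalizes the assistive force while the user trains; the sketch below shows a toy iterative force-adaptation loop against a simulated user. The plant model, learning rate, and comfort band are illustrative assumptions, not the authors' controller.

```python
# Minimal sketch of an online force-personalization loop; the plant model,
# learning rate, and target tracking error are illustrative assumptions.

def simulate_user(assist_force: float, user_strength: float) -> float:
    """Toy plant: tracking error shrinks as assistance approaches what the user needs."""
    needed = 10.0 - user_strength          # weaker users need more assistance
    return abs(needed - assist_force) * 0.1

def personalize_force(user_strength: float, iterations: int = 50, lr: float = 0.5) -> float:
    force = 0.0
    for _ in range(iterations):
        error = simulate_user(force, user_strength)
        # Increase assistance while the tracking error is above the comfort band.
        force += lr * (error - 0.05)
    return force

for strength in (2.0, 6.0):
    print(f"user strength {strength}: personalized force {personalize_force(strength):.2f}")
```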
arXiv Detail & Related papers (2021-10-01T15:28:44Z) - Early Mobility Recognition for Intensive Care Unit Patients Using Accelerometers [3.772793938066986]
We propose a new healthcare application of human activity recognition, early mobility recognition for Intensive Care Unit (ICU) patients.
Our system includes accelerometer-based data collection from ICU patients and an AI model to recognize patients' early mobility.
Our results show that our system improves model accuracy from 77.78% to 81.86% and reduces the model instability (standard deviation) from 16.69% to 6.92%.
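Accelerometer-based mobility recognition typically starts from sliding-window features; the sketch below builds such features from synthetic tri-axial data. The window length, step, and feature set are illustrative assumptions rather than the paper's pipeline.

```python
# Illustrative sliding-window featurization of tri-axial accelerometer data
# for mobility recognition; window length and features are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def window_features(acc_xyz: np.ndarray) -> np.ndarray:
    """Features for one window of shape (samples, 3): per-axis mean/std plus overall magnitude."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    return np.concatenate([acc_xyz.mean(axis=0), acc_xyz.std(axis=0), [magnitude.mean()]])

def featurize(acc_xyz: np.ndarray, window: int = 250, step: int = 125) -> np.ndarray:
    """Slide a window over the recording and stack one feature row per window."""
    rows = [window_features(acc_xyz[i:i + window])
            for i in range(0, len(acc_xyz) - window + 1, step)]
    return np.vstack(rows)

recording = rng.normal(size=(2500, 3))          # ~10 s of 3-axis data at 250 Hz
features = featurize(recording)
print(features.shape)                            # (n_windows, 7)
```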
arXiv Detail & Related papers (2021-06-28T22:59:31Z) - Wearable Health Monitoring System for Older Adults in a Smart Home Environment [1.14219428942199]
We present the design of a wearable health monitoring system suitable for older adults in a smart home context.
The proposed system offers solutions to monitor the stress, blood pressure, and location of an individual within a smart home environment.
A voice-based prototype is also implemented and the feasibility of the proposed system for integration in a smart home environment is analyzed.
arXiv Detail & Related papers (2021-06-09T03:16:54Z) - The Medkit-Learn(ing) Environment: Medical Decision Modelling through Simulation [81.72197368690031]
We present a new benchmarking suite designed specifically for medical sequential decision making.
The Medkit-Learn(ing) Environment is a publicly available Python package providing simple and easy access to high-fidelity synthetic medical data.
arXiv Detail & Related papers (2021-06-08T10:38:09Z) - AEGIS: A real-time multimodal augmented reality computer vision based system to assist facial expression recognition for individuals with autism spectrum disorder [93.0013343535411]
This paper presents the development of a multimodal augmented reality (AR) system which combines the use of computer vision and deep convolutional neural networks (CNNs).
The proposed system, which we call AEGIS, is an assistive technology deployable on a variety of user devices including tablets, smartphones, video conference systems, or smartglasses.
We leverage both spatial and temporal information in order to provide an accurate expression prediction, which is then converted into its corresponding visualization and drawn on top of the original video frame.
arXiv Detail & Related papers (2020-10-22T17:20:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information shown and is not responsible for any consequences arising from its use.