GraphAU-Pain: Graph-based Action Unit Representation for Pain Intensity Estimation
- URL: http://arxiv.org/abs/2505.19802v1
- Date: Mon, 26 May 2025 10:35:42 GMT
- Title: GraphAU-Pain: Graph-based Action Unit Representation for Pain Intensity Estimation
- Authors: Zhiyu Wang, Yang Liu, Hatice Gunes
- Abstract summary: Existing data-driven methods of detecting pain from facial expressions are limited in interpretability and severity quantification. By utilizing a graph neural network, our framework offers improved interpretability and significant performance gains. Experiments conducted on the publicly available UNBC dataset demonstrate the effectiveness of GraphAU-Pain.
- Score: 14.267177649888994
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding pain-related facial behaviors is essential for digital healthcare in terms of effective monitoring, assisted diagnostics, and treatment planning, particularly for patients unable to communicate verbally. Existing data-driven methods of detecting pain from facial expressions are limited in interpretability and severity quantification. To this end, we propose GraphAU-Pain, leveraging a graph-based framework to model facial Action Units (AUs) and their interrelationships for pain intensity estimation. AUs are represented as graph nodes, with co-occurrence relationships as edges, enabling a more expressive depiction of pain-related facial behaviors. By utilizing a relational graph neural network, our framework offers improved interpretability and significant performance gains. Experiments conducted on the publicly available UNBC dataset demonstrate the effectiveness of GraphAU-Pain, achieving an F1-score of 66.21% and accuracy of 87.61% in pain intensity estimation.
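The AU-graph idea in the abstract can be illustrated with a minimal numpy sketch: AUs become graph nodes, a co-occurrence matrix defines edges, and one graph-convolution step propagates information between related AUs before a pooled readout scores pain intensity. The AU list, co-occurrence counts, and all weights below are illustrative placeholders, not the authors' implementation or values from the paper.

```python
import numpy as np

# Hypothetical set of pain-related facial Action Units (graph nodes).
aus = ["AU4", "AU6", "AU7", "AU9", "AU10", "AU43"]
n, d = len(aus), 8

rng = np.random.default_rng(0)
X = rng.normal(size=(n, d))  # per-AU node features (e.g. from a CNN backbone)

# Symmetric AU co-occurrence counts (hypothetical numbers); nonzero = edge.
C = np.array([
    [0, 5, 6, 2, 3, 4],
    [5, 0, 9, 1, 7, 2],
    [6, 9, 0, 2, 6, 3],
    [2, 1, 2, 0, 4, 1],
    [3, 7, 6, 4, 0, 2],
    [4, 2, 3, 1, 2, 0],
], dtype=float)

# Normalized adjacency with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}.
A = (C > 0).astype(float) + np.eye(n)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

# One graph-convolution layer: H = ReLU(A_hat X W).
W = rng.normal(size=(d, d))
H = np.maximum(A_hat @ X @ W, 0.0)

# Pool node embeddings and score pain intensity with a linear head (placeholder).
w_out = rng.normal(size=d)
pain_score = float(H.mean(axis=0) @ w_out)
print(H.shape, round(pain_score, 3))
```

The paper uses a relational graph neural network, which would additionally assign edge-type-specific weight matrices; the single shared `W` above is a simplification.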
Related papers
- SynPAIN: A Synthetic Dataset of Pain and Non-Pain Facial Expressions [3.0806468055954737]
Existing pain detection datasets suffer from limited ethnic/racial diversity, privacy constraints, and underrepresentation of older adults. We present SynPAIN, a large-scale synthetic dataset containing 10,710 facial expression images. Using commercial generative AI tools, we created demographically balanced synthetic identities with clinically meaningful pain expressions.
arXiv Detail & Related papers (2025-07-25T20:54:04Z)
- Beyond Feature Importance: Feature Interactions in Predicting Post-Stroke Rigidity with Graph Explainable AI [12.69689718988924]
Post-stroke rigidity, characterized by increased muscle tone and stiffness, significantly affects survivors' mobility and quality of life. This study addresses the challenge of predicting post-stroke rigidity by emphasizing feature interactions through graph-based explainable AI.
arXiv Detail & Related papers (2025-04-10T22:20:22Z)
- Improving Pain Classification using Spatio-Temporal Deep Learning Approaches with Facial Expressions [0.27309692684728604]
Pain management and severity detection are crucial for effective treatment. Traditional self-reporting methods are subjective and may be unsuitable for non-verbal individuals. We explore automated pain detection using facial expressions.
arXiv Detail & Related papers (2025-01-12T11:54:46Z)
- Dynamic Graph Enhanced Contrastive Learning for Chest X-ray Report Generation [92.73584302508907]
We propose a knowledge graph with Dynamic structure and nodes to facilitate medical report generation with Contrastive Learning.
In detail, the fundamental structure of our graph is pre-constructed from general knowledge.
Each image feature is integrated with its very own updated graph before being fed into the decoder module for report generation.
arXiv Detail & Related papers (2023-03-18T03:53:43Z)
- Intelligent Sight and Sound: A Chronic Cancer Pain Dataset [74.77784420691937]
This paper introduces the first chronic cancer pain dataset, collected as part of the Intelligent Sight and Sound (ISS) clinical trial.
The data collected to date consists of 29 patients, 509 smartphone videos, 189,999 frames, and self-reported affective and activity pain scores.
Early models that use static images and multi-modal data to predict self-reported pain levels reveal significant gaps in currently available pain-prediction methods.
arXiv Detail & Related papers (2022-04-07T22:14:37Z)
- Predicting Patient Readmission Risk from Medical Text via Knowledge Graph Enhanced Multiview Graph Convolution [67.72545656557858]
We propose a new method that uses medical text of Electronic Health Records for prediction.
We represent discharge summaries of patients with multiview graphs enhanced by an external knowledge graph.
Experimental results prove the effectiveness of our method, yielding state-of-the-art performance.
arXiv Detail & Related papers (2021-12-19T01:45:57Z)
- U-GAT: Multimodal Graph Attention Network for COVID-19 Outcome Prediction [31.26241022394112]
During the first wave of COVID-19, hospitals were overwhelmed with the high number of admitted patients.
A holistic graph-based approach combining both imaging and non-imaging information could enable an earlier prognosis.
We introduce a multimodal similarity metric to build a population graph for clustering patients and an image-based end-to-end Graph Attention Network to process this graph.
arXiv Detail & Related papers (2021-07-29T12:00:54Z)
- Non-contact Pain Recognition from Video Sequences with Remote Physiological Measurements Prediction [53.03469655641418]
We present a novel multi-task learning framework which encodes both appearance changes and physiological cues in a non-contact manner for pain recognition.
We establish the state-of-the-art performance of non-contact pain recognition on publicly available pain databases.
arXiv Detail & Related papers (2021-05-18T20:47:45Z)
- IA-GCN: Interpretable Attention based Graph Convolutional Network for Disease prediction [47.999621481852266]
We propose an interpretable graph learning-based model which interprets the clinical relevance of the input features towards the task.
In a clinical scenario, such a model can assist the clinical experts in better decision-making for diagnosis and treatment planning.
Our proposed model outperforms the compared methods, with average accuracy gains of 3.2% on Tadpole, 1.6% on UKBB Gender, and 2% on the UKBB Age prediction task.
arXiv Detail & Related papers (2021-03-29T13:04:02Z)
- Dynamic Graph Correlation Learning for Disease Diagnosis with Incomplete Labels [66.57101219176275]
Disease diagnosis on chest X-ray images is a challenging multi-label classification task.
We propose a Disease Diagnosis Graph Convolutional Network (DD-GCN) that presents a novel view of investigating the inter-dependency among different diseases.
Our method is the first to build a graph over the feature maps with a dynamic adjacency matrix for correlation learning.
arXiv Detail & Related papers (2020-02-26T17:10:48Z)
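The dynamic adjacency idea described for DD-GCN above (building a graph over feature maps whose edge weights depend on the input) can be sketched as follows. The node count, feature size, and dot-product similarity are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dynamic_adjacency(F):
    """Row-softmax of pairwise dot-product similarity between node features,
    so the adjacency is recomputed per input rather than fixed a priori."""
    S = F @ F.T                             # (n, n) similarity logits
    S = S - S.max(axis=1, keepdims=True)    # stabilize the softmax
    E = np.exp(S)
    return E / E.sum(axis=1, keepdims=True)  # each row sums to 1

rng = np.random.default_rng(1)
n_diseases, d = 5, 16
F = rng.normal(size=(n_diseases, d))  # pooled per-disease feature maps (placeholder)

A = dynamic_adjacency(F)              # input-dependent graph over diseases
W = rng.normal(size=(d, d))
H = np.maximum(A @ F @ W, 0.0)        # one graph-convolution step with ReLU

print(A.shape, np.allclose(A.sum(axis=1), 1.0))
```

Because `A` is derived from the features themselves, correlations between diseases are learned per image instead of being fixed by a static co-occurrence prior.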
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.