Interpersonal Relationship Analysis with Dyadic EEG Signals via Learning
Spatial-Temporal Patterns
- URL: http://arxiv.org/abs/2401.03250v1
- Date: Sat, 6 Jan 2024 16:17:58 GMT
- Title: Interpersonal Relationship Analysis with Dyadic EEG Signals via Learning
Spatial-Temporal Patterns
- Authors: Wenqi Ji, Fang Liu, Xinxin Du, Niqi Liu, Chao Zhou, Mingjin Yu,
Guozhen Zhao, Yong-Jin Liu
- Abstract summary: We propose a social relationship analysis framework using patterns derived from dyadic EEG signals.
We show that the social relationship type (stranger or friend) between two individuals can be effectively identified through their EEG data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Interpersonal relationship quality is pivotal in social and occupational
contexts. Existing analyses of interpersonal relationships mostly rely on
subjective self-reports, whereas objective quantification remains challenging.
In this paper, we propose a novel social relationship analysis framework using
spatio-temporal patterns derived from dyadic EEG signals, which can be applied
to quantitatively measure team cooperation in corporate team building, and
evaluate interpersonal dynamics between therapists and patients in psychiatric
therapy. First, we constructed a dyadic-EEG dataset from 72 pairs of
participants in two relationship types (stranger or friend) while they watched
emotional videos simultaneously. Then we proposed a deep neural network on
dyadic-subject EEG signals, in which we combine a dynamic graph convolutional
neural network, which characterizes the relationships among the EEG channels,
with a 1-dimensional convolution that extracts information from the
time sequence. To obtain feature vectors from two EEG recordings that well
represent the relationship of two subjects, we integrate deep canonical
correlation analysis and triplet loss for training the network. Experimental
results show that the social relationship type (stranger or friend) between two
individuals can be effectively identified through their EEG data.
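The two-branch architecture the abstract describes (a dynamic graph convolution over the EEG channels for the spatial pattern, a 1-D convolution along time for the temporal pattern) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the learnable adjacency, layer sizes, pooling, and every function name here are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_graph_conv(x, adj_logits, w):
    """One graph-convolution step over EEG channels (simplified stand-in
    for the dynamic graph CNN the abstract names).
    x: (channels, feats); adj_logits: (channels, channels); w: (feats, out)."""
    a = softmax(adj_logits, axis=-1)    # learnable, row-normalized adjacency
    return np.maximum(a @ x @ w, 0.0)   # aggregate neighbors, project, ReLU

def temporal_conv1d(x, kernel):
    """Depthwise 1-D convolution along time. x: (channels, time); kernel: (k,)."""
    k = len(kernel)
    pad = np.pad(x, ((0, 0), (k // 2, k // 2)))
    return np.stack([np.convolve(row, kernel, mode="valid") for row in pad])

def encode_subject(x, adj_logits, w_g, w_out, kernel):
    """Temporal 1-D conv -> coarse time pooling -> dynamic graph conv ->
    linear head, giving one embedding per subject; two weight-shared
    copies of this encoder would embed a dyad."""
    h = temporal_conv1d(x, kernel)              # (channels, time)
    h = h.reshape(h.shape[0], -1, 8).mean(-1)   # pool time into coarse bins
    h = dynamic_graph_conv(h, adj_logits, w_g)  # (channels, g_out)
    return h.flatten() @ w_out                  # (emb_dim,)

# Toy sizes: 32 EEG channels, 64 time samples, 16-dim embedding.
n_ch, T, emb_dim = 32, 64, 16
x = rng.standard_normal((n_ch, T))
emb = encode_subject(
    x,
    rng.standard_normal((n_ch, n_ch)),        # adjacency logits
    rng.standard_normal((8, 8)),              # graph-conv weights
    rng.standard_normal((n_ch * 8, emb_dim)), # output head
    np.array([0.25, 0.5, 0.25]),              # smoothing kernel
)
print(emb.shape)  # (16,)
```

In a trained system these weights would be learned jointly; here they are random purely to show the tensor shapes flowing through the spatial and temporal branches.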
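The abstract also says the network is trained by integrating deep canonical correlation analysis (DCCA) with a triplet loss. A hedged NumPy sketch of the two objectives on already-computed embeddings (the correlation quantity DCCA maximizes, and the standard margin-based triplet loss; batch sizes, dimensions, and the regularizer `eps` are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def cca_correlation(h1, h2, eps=1e-6):
    """Sum of canonical correlations between two batches of embeddings;
    DCCA trains the encoders to maximize this. h1, h2: (batch, dim)."""
    h1 = h1 - h1.mean(0)
    h2 = h2 - h2.mean(0)
    n = h1.shape[0] - 1
    s11 = h1.T @ h1 / n + eps * np.eye(h1.shape[1])  # regularized covariances
    s22 = h2.T @ h2 / n + eps * np.eye(h2.shape[1])
    s12 = h1.T @ h2 / n

    def inv_sqrt(s):  # symmetric inverse square root via eigendecomposition
        vals, vecs = np.linalg.eigh(s)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    # Canonical correlations are the singular values of the whitened
    # cross-covariance T = S11^{-1/2} S12 S22^{-1/2}.
    t = inv_sqrt(s11) @ s12 @ inv_sqrt(s22)
    return np.linalg.svd(t, compute_uv=False).sum()

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Pull the anchor toward a same-relationship dyad feature and push it
    away from a different-relationship one by at least the margin."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

rng = np.random.default_rng(1)
h = rng.standard_normal((20, 4))
corr_self = cca_correlation(h, h)  # near 4, the embedding dimension
corr_rand = cca_correlation(h, rng.standard_normal((20, 4)))  # lower

anchor = np.zeros(4)
loss_easy = triplet_loss(anchor, np.zeros(4), np.array([3.0, 0, 0, 0]))  # 0.0
loss_hard = triplet_loss(anchor, np.array([3.0, 0, 0, 0]), np.zeros(4))  # 4.0
```

Identical inputs reach the maximum correlation (the embedding dimension), while unrelated inputs score lower; in training, the same-dyad recordings would play the correlated role and the triplet loss would separate stranger pairs from friend pairs in the feature space.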
Related papers
- Two in One Go: Single-stage Emotion Recognition with Decoupled Subject-context Transformer [78.35816158511523]
We present a single-stage emotion recognition approach, employing a Decoupled Subject-Context Transformer (DSCT) for simultaneous subject localization and emotion classification.
We evaluate our single-stage framework on two widely used context-aware emotion recognition datasets, CAER-S and EMOTIC.
arXiv Detail & Related papers (2024-04-26T07:30:32Z) - TrajPRed: Trajectory Prediction with Region-based Relation Learning [11.714283460714073]
We propose a region-based relation learning paradigm for predicting human trajectories in traffic scenes.
Social interactions are modeled by relating the temporal changes of local joint information from a global perspective.
We integrate multi-goal estimation and region-based relation learning to model the two stimuli, social interactions, and goals, in a prediction framework.
arXiv Detail & Related papers (2024-04-10T12:31:43Z) - Towards a Unified Transformer-based Framework for Scene Graph Generation
and Human-object Interaction Detection [116.21529970404653]
We introduce SG2HOI+, a unified one-step model based on the Transformer architecture.
Our approach employs two interactive hierarchical Transformers to seamlessly unify the tasks of SGG and HOI detection.
Our approach achieves competitive performance when compared to state-of-the-art HOI methods.
arXiv Detail & Related papers (2023-11-03T07:25:57Z) - Graph AI in Medicine [9.733108180046555]
Graph neural networks (GNNs) process data holistically by viewing modalities as nodes interconnected by their relationships.
GNNs capture information through localized neural transformations defined on graph relationships.
Knowledge graphs can enhance interpretability by aligning model-driven insights with medical knowledge.
arXiv Detail & Related papers (2023-10-20T19:01:01Z) - MedNgage: A Dataset for Understanding Engagement in Patient-Nurse
Conversations [4.847266237348932]
Patients who effectively manage their symptoms often demonstrate higher levels of engagement in conversations and interventions with healthcare practitioners.
It is crucial for AI systems to understand the engagement in natural conversations between patients and practitioners to better contribute toward patient care.
We present a novel dataset (MedNgage) which consists of patient-nurse conversations about cancer symptom management.
arXiv Detail & Related papers (2023-05-31T16:06:07Z) - Co-Located Human-Human Interaction Analysis using Nonverbal Cues: A
Survey [71.43956423427397]
We aim to identify the nonverbal cues and computational methodologies resulting in effective performance.
This survey differs from its counterparts by involving the widest spectrum of social phenomena and interaction settings.
Some major observations are: the most frequently used nonverbal cue is speaking activity, the most common computational method is the support vector machine, and the most common interaction setting is a meeting of 3-4 persons sensed with microphones and cameras.
arXiv Detail & Related papers (2022-07-20T13:37:57Z) - Spatio-Temporal Interaction Graph Parsing Networks for Human-Object
Interaction Recognition [55.7731053128204]
In a given video-based Human-Object Interaction scene, modeling the spatio-temporal relationship between humans and objects is an important cue for understanding the contextual information presented in the video.
With effective spatio-temporal relationship modeling, it is possible not only to uncover the contextual information in each frame but also to directly capture inter-time dependencies.
Full use of appearance features, spatial locations, and semantic information is also key to improving video-based Human-Object Interaction recognition performance.
arXiv Detail & Related papers (2021-08-19T11:57:27Z) - Deep sr-DDL: Deep Structurally Regularized Dynamic Dictionary Learning
to Integrate Multimodal and Dynamic Functional Connectomics data for
Multidimensional Clinical Characterizations [7.973810752596346]
We propose a novel integrated framework that jointly models complementary information from resting-state functional MRI (rs-fMRI) connectivity and diffusion tensor imaging (DTI) tractography.
Our framework couples a generative model of the connectomics data with a deep network that predicts behavioral scores.
Our hybrid model outperforms several state-of-the-art approaches at clinical outcome prediction and learns interpretable multimodal neural signatures of brain organization.
arXiv Detail & Related papers (2020-08-27T23:43:56Z) - DRG: Dual Relation Graph for Human-Object Interaction Detection [65.50707710054141]
We tackle the challenging problem of human-object interaction (HOI) detection.
Existing methods either recognize the interaction of each human-object pair in isolation or perform joint inference based on complex appearance-based features.
In this paper, we leverage an abstract spatial-semantic representation to describe each human-object pair and aggregate the contextual information of the scene via a dual relation graph.
arXiv Detail & Related papers (2020-08-26T17:59:40Z) - Learning Dynamic and Personalized Comorbidity Networks from Event Data
using Deep Diffusion Processes [102.02672176520382]
Comorbid diseases co-occur and progress via complex temporal patterns that vary among individuals.
In electronic health records we can observe the different diseases a patient has, but can only infer the temporal relationship between each co-morbid condition.
We develop deep diffusion processes to model "dynamic comorbidity networks"
arXiv Detail & Related papers (2020-01-08T15:47:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.