Dynamic Emotion Modeling with Learnable Graphs and Graph Inception Network
- URL: http://arxiv.org/abs/2008.02661v2
- Date: Mon, 8 Feb 2021 12:21:00 GMT
- Title: Dynamic Emotion Modeling with Learnable Graphs and Graph Inception Network
- Authors: A. Shirian, S. Tripathi, T. Guha
- Abstract summary: We present the Learnable Graph Inception Network (L-GrIN) that jointly learns to recognize emotion and to identify the underlying graph structure in the dynamic data.
We evaluate the proposed architecture on five benchmark emotion recognition databases spanning three different modalities.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human emotion is expressed, perceived and captured using a variety of dynamic
data modalities, such as speech (verbal), videos (facial expressions) and
motion sensors (body gestures). We propose a generalized approach to emotion
recognition that can adapt across modalities by modeling dynamic data as
structured graphs. The motivation behind the graph approach is to build compact
models without compromising on performance. To alleviate the problem of optimal
graph construction, we cast this as a joint graph learning and classification
task. To this end, we present the Learnable Graph Inception Network (L-GrIN)
that jointly learns to recognize emotion and to identify the underlying graph
structure in the dynamic data. Our architecture comprises multiple novel
components: a new graph convolution operation, a graph inception layer,
learnable adjacency, and a learnable pooling function that yields a graph-level
embedding. We evaluate the proposed architecture on five benchmark emotion
recognition databases spanning three different modalities (video, audio, motion
capture), where each database captures one of the following emotional cues:
facial expressions, speech and body gestures. We achieve state-of-the-art
performance on all five databases, outperforming several competitive baselines
and relevant existing methods. Our graph architecture shows superior
performance with significantly fewer parameters (compared to convolutional or
recurrent neural networks), suggesting its applicability to resource-constrained
devices.
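The learnable-adjacency and graph-level pooling ideas from the abstract can be illustrated with a minimal sketch. This is a hypothetical, simplified single-layer form for intuition only, not the authors' L-GrIN implementation: adjacency logits are normalized into soft edge weights, one propagation step mixes node features, and attention-style pooling produces a graph-level embedding.

```python
import math

def softmax(row):
    # Normalize a row of logits into non-negative weights summing to 1.
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(a, b):
    # Plain list-of-lists matrix multiply: (m x k) . (k x n) -> (m x n).
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def graph_conv(adj_logits, features, weight):
    # Learnable adjacency: free logits -> row-stochastic edge weights,
    # so the graph structure itself is optimized during training.
    adj = [softmax(row) for row in adj_logits]
    # One propagation step: H' = ReLU(A . H . W).
    h = matmul(matmul(adj, features), weight)
    return [[max(0.0, v) for v in row] for row in h]

def graph_pool(node_embeddings, scores):
    # Learnable pooling: attention-weighted sum over nodes yields
    # a single fixed-size graph-level embedding.
    alphas = softmax(scores)
    dim = len(node_embeddings[0])
    return [sum(alphas[i] * node_embeddings[i][d]
                for i in range(len(node_embeddings)))
            for d in range(dim)]

# Toy example: 3 nodes (e.g. video frames), 2-dim features.
adj_logits = [[0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weight = [[1.0, 0.0], [0.0, 1.0]]  # identity, for clarity
node_h = graph_conv(adj_logits, features, weight)
graph_embedding = graph_pool(node_h, [0.0, 0.0, 0.0])
```

In the full model, `adj_logits`, `weight`, and the pooling `scores` would all be trained jointly with the emotion classifier; with uniform scores, the pooling above reduces to mean pooling.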
Related papers
- Signed Graph Neural Ordinary Differential Equation for Modeling Continuous-time Dynamics [13.912268915939656]
The prevailing approach of integrating graph neural networks with ordinary differential equations has demonstrated promising performance.
We introduce a novel approach: a signed graph neural ordinary differential equation that addresses the limitation of existing methods in capturing signed information.
Our proposed solution boasts both flexibility and efficiency.
arXiv Detail & Related papers (2023-12-18T13:45:33Z) - When Graph Data Meets Multimodal: A New Paradigm for Graph Understanding and Reasoning [54.84870836443311]
The paper presents a new paradigm for understanding and reasoning about graph data by integrating image encoding and multimodal technologies.
This approach enables the comprehension of graph data through an instruction-response format, utilizing GPT-4V's advanced capabilities.
The study evaluates this paradigm on various graph types, highlighting the model's strengths and weaknesses, particularly in Chinese OCR performance and complex reasoning tasks.
arXiv Detail & Related papers (2023-12-16T08:14:11Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph [53.70520466556453]
We propose GraphFormers, where layerwise GNN components are nested alongside the transformer blocks of language models.
With the proposed architecture, the text encoding and the graph aggregation are fused into an iterative workflow.
In addition, a progressive learning strategy is introduced, where the model is successively trained on manipulated and original data to reinforce its capability of integrating information on graphs.
arXiv Detail & Related papers (2021-05-06T12:20:41Z) - Compact Graph Architecture for Speech Emotion Recognition [0.0]
A compact, efficient and scalable way to represent data is in the form of graphs.
We construct a Graph Convolution Network (GCN)-based architecture that can perform an accurate graph convolution.
Our model achieves comparable performance to the state-of-the-art with significantly fewer learnable parameters.
arXiv Detail & Related papers (2020-08-05T12:09:09Z) - GraphOpt: Learning Optimization Models of Graph Formation [72.75384705298303]
We propose an end-to-end framework that learns an implicit model of graph structure formation and discovers an underlying optimization mechanism.
The learned objective can serve as an explanation for the observed graph properties, thereby lending itself to transfer across different graphs within a domain.
GraphOpt poses link formation in graphs as a sequential decision-making process and solves it using a maximum entropy inverse reinforcement learning algorithm.
arXiv Detail & Related papers (2020-07-07T16:51:39Z) - Temporal Graph Networks for Deep Learning on Dynamic Graphs [4.5158585619109495]
We present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events.
Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while being more computationally efficient.
arXiv Detail & Related papers (2020-06-18T16:06:18Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Unifying Graph Embedding Features with Graph Convolutional Networks for Skeleton-based Action Recognition [18.001693718043292]
We propose a novel framework, which unifies 15 graph embedding features into the graph convolutional network for human action recognition.
Our model is validated on three large-scale datasets, namely NTU-RGB+D, Kinetics and SYSU-3D.
arXiv Detail & Related papers (2020-03-06T02:31:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.