TANGO: Graph Neural Dynamics via Learned Energy and Tangential Flows
- URL: http://arxiv.org/abs/2508.05070v1
- Date: Thu, 07 Aug 2025 06:44:01 GMT
- Title: TANGO: Graph Neural Dynamics via Learned Energy and Tangential Flows
- Authors: Moshe Eliasof, Eldad Haber, Carola-Bibiane Schönlieb
- Abstract summary: We introduce TANGO -- a dynamical systems inspired framework for graph representation learning. At the core of our approach is a learnable Lyapunov function over node embeddings. We incorporate a novel tangential component, learned via message passing, that evolves features while maintaining the energy value.
- Score: 17.546965223021786
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce TANGO -- a dynamical systems inspired framework for graph representation learning that governs node feature evolution through a learned energy landscape and its associated descent dynamics. At the core of our approach is a learnable Lyapunov function over node embeddings, whose gradient defines an energy-reducing direction that guarantees convergence and stability. To enhance flexibility while preserving the benefits of energy-based dynamics, we incorporate a novel tangential component, learned via message passing, that evolves features while maintaining the energy value. This decomposition into orthogonal flows of energy gradient descent and tangential evolution yields a flexible form of graph dynamics and enables effective signal propagation even in flat or ill-conditioned energy regions that often appear in graph learning. Our method mitigates oversquashing and is compatible with different graph neural network backbones. Empirically, TANGO achieves strong performance across a diverse set of node and graph classification and regression benchmarks, demonstrating the effectiveness of jointly learned energy functions and tangential flows for graph neural networks.
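The descent-plus-tangential decomposition described in the abstract can be sketched numerically. In the sketch below, a fixed graph Dirichlet energy stands in for TANGO's learned Lyapunov function and a fixed vector stands in for the message-passing output; both are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def dirichlet_energy(X, A):
    # Stand-in energy E(X) = 1/2 * tr(X^T L X); TANGO instead *learns*
    # a Lyapunov function over the node embeddings.
    L = np.diag(A.sum(axis=1)) - A      # combinatorial graph Laplacian
    return 0.5 * np.trace(X.T @ L @ X)

def energy_grad(X, A):
    # Gradient of the quadratic stand-in energy above.
    L = np.diag(A.sum(axis=1)) - A
    return L @ X

def tango_step(X, A, msg, eta=0.1):
    # One hypothetical TANGO-style update: descend the energy gradient,
    # then add the component of `msg` orthogonal to that gradient, which
    # moves the features without changing the energy (to first order).
    g = energy_grad(X, A)
    gn = g / (np.linalg.norm(g) + 1e-12)
    tangential = msg - np.sum(msg * gn) * gn
    return X - eta * g + eta * tangential

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
X = np.array([[1.0], [0.0], [2.0]])
msg = np.array([[0.5], [-0.2], [0.1]])  # placeholder message-passing output
X1 = tango_step(X, A, msg)
```

For small step sizes the gradient term provably reduces the energy, while the projected term leaves it unchanged to first order, mirroring the orthogonal-flows decomposition described above.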
Related papers
- Depth-Adaptive Graph Neural Networks via Learnable Bakry-Émery Curvature [7.2716257100195385]
Graph Neural Networks (GNNs) have demonstrated strong representation learning capabilities for graph-based tasks. Recent advances in GNNs leverage geometric properties, such as curvature, to enhance their representation capabilities. We propose integrating Bakry-Émery curvature, which captures both structural and task-driven aspects of information propagation.
arXiv Detail & Related papers (2025-03-03T00:48:41Z) - Dirichlet Energy Enhancement of Graph Neural Networks by Framelet Augmentation [19.56268823452656]
We introduce a framelet system into the analysis of Dirichlet energy and take a multi-scale perspective to leverage the Dirichlet energy.
Based on that, we design the Energy Enhanced Convolution (EEConv), which is an effective and practical operation.
Experiments show that deep GNNs with EEConv achieve state-of-the-art performance over various node classification datasets.
arXiv Detail & Related papers (2023-11-09T22:22:18Z) - Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep feature selection (FS) method that simultaneously conducts feature selection and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability issue of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-21T08:15:55Z) - Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations [21.936437653875245]
This paper focuses on representation learning for dynamic graphs with temporal interactions.
We propose a generic differential model for dynamic graphs that characterises the continuous dynamic evolution of node embedding trajectories.
Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without integration by segments.
arXiv Detail & Related papers (2023-02-22T12:59:38Z) - Generalized energy and gradient flow via graph framelets [27.71018932795014]
We provide a theoretical understanding of the framelet-based graph neural networks through the perspective of energy gradient flow.
By viewing the framelet-based models as discretized gradient flows of some energy, we show that they can induce both low-frequency- and high-frequency-dominated dynamics.
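As a toy illustration (not the framelet construction itself), the gradient flow of the plain Dirichlet energy is the graph heat equation, a low-frequency-dominated dynamic; reversing the sign of the flow gives a high-frequency-dominated, sharpening dynamic:

```python
import numpy as np

A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)  # star graph
L = np.diag(A.sum(axis=1)) - A

def evolve(X, sign, steps=50, dt=0.05):
    # Explicit Euler discretisation of dX/dt = sign * (-L X).
    for _ in range(steps):
        X = X + sign * dt * (-L @ X)
    return X

X0 = np.array([[1.0], [0.0], [0.5]])
smooth = evolve(X0, +1)   # heat flow: node features converge (smoothing)
sharp = evolve(X0, -1)    # reversed flow: feature differences grow
```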
arXiv Detail & Related papers (2022-10-08T23:40:45Z) - Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
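The unrolling idea can be sketched on a toy convolutive mixture; the quadratic mixture model, step sizes, and layer count below are illustrative assumptions, not the GDN architecture itself:

```python
import numpy as np

def soft(Z, t):
    # Soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def gdn_unroll(S, a2=0.2, layers=100, step=0.2, lam=0.01):
    # Unrolled, truncated proximal gradient iterations that recover a
    # sparse latent graph A from an observed mixture S ≈ A + a2 * A^2.
    A = np.zeros_like(S)
    for _ in range(layers):
        R = A + a2 * A @ A - S            # residual of the mixture model
        grad = R + a2 * (A @ R + R @ A)   # grad of 0.5*||R||_F^2 (A symmetric)
        A = soft(A - step * grad, step * lam)
    return A

A_true = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
S = A_true + 0.2 * A_true @ A_true   # observed "convolutive mixture"
A_hat = gdn_unroll(S)
```

In an actual GDN the step sizes, thresholds, and mixture coefficients would be learned per layer and the truncated iterations trained end-to-end against a task loss.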
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Graph-Coupled Oscillator Networks [23.597444325599835]
Graph-Coupled Oscillator Networks (GraphCON) is a novel framework for deep learning on graphs.
We show that our framework offers competitive performance with respect to the state-of-the-art on a variety of graph-based learning tasks.
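A minimal sketch of graph-coupled oscillator dynamics in this spirit follows; the Laplacian coupling, tanh nonlinearity, and coefficients are stand-ins for GraphCON's learned components:

```python
import numpy as np

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
L = np.diag(A.sum(axis=1)) - A

def graphcon_steps(X, Y, steps=100, dt=0.1, gamma=1.0, alpha=0.5):
    # Each node is a damped nonlinear oscillator coupled to its neighbours:
    # X'' = tanh(-L X) - gamma * X - alpha * X', integrated with a
    # symplectic-Euler-style scheme (velocity Y updated before position X).
    for _ in range(steps):
        Y = Y + dt * (np.tanh(-L @ X) - gamma * X - alpha * Y)
        X = X + dt * Y
    return X, Y

X0 = np.array([[1.0], [-1.0], [0.5]])
Y0 = np.zeros_like(X0)
Xf, Yf = graphcon_steps(X0, Y0)   # damped oscillations decay over time
```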
arXiv Detail & Related papers (2022-02-04T18:29:49Z) - Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z) - Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
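An affine skip connection can be sketched as follows; the mean-aggregation convolution and the random `W`, `b` are placeholder assumptions standing in for a trained backbone and trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1], [1, 0]], dtype=float)   # two connected nodes
X = rng.normal(size=(2, 4))                   # node features
W = rng.normal(size=(4, 4))                   # affine weights (stand-in)
b = rng.normal(size=4)                        # affine bias (stand-in)

def graph_conv(X, A):
    # Simplest mean-aggregation convolution as a placeholder operator;
    # the affine skip can be combined with any graph convolution.
    D_inv = np.diag(1.0 / A.sum(axis=1))
    return D_inv @ A @ X

out = graph_conv(X, A) + (X @ W + b)   # affine skip connection
```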
arXiv Detail & Related papers (2020-04-06T13:25:46Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.