Federated Learning of Nonlinear Temporal Dynamics with Graph Attention-based Cross-Client Interpretability
- URL: http://arxiv.org/abs/2602.13485v1
- Date: Fri, 13 Feb 2026 21:41:52 GMT
- Title: Federated Learning of Nonlinear Temporal Dynamics with Graph Attention-based Cross-Client Interpretability
- Authors: Ayse Tursucular, Ayush Mohanty, Nazal Mohamed, Nagi Gebraeel
- Abstract summary: We present a framework for learning temporal interdependencies across clients in a decentralized nonlinear system. A central server learns a graph-structured neural state transition model over the communicated latent states using a Graph Attention Network. For interpretability, we relate the Jacobian of the learned server-side transition model to attention coefficients, providing the first interpretable characterization of cross-client temporal interdependencies.
- Score: 2.8582274879786684
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Networks of modern industrial systems are increasingly monitored by distributed sensors, where each system comprises multiple subsystems generating high-dimensional time series data. These subsystems are often interdependent, making it important to understand how temporal patterns at one subsystem relate to others. This is challenging in decentralized settings where raw measurements cannot be shared and client observations are heterogeneous. In practical deployments, each subsystem (client) operates a fixed proprietary model that cannot be modified or retrained, limiting existing approaches. Nonlinear dynamics further make cross-client temporal interdependencies difficult to interpret because they are embedded in nonlinear state transition functions. We present a federated framework for learning temporal interdependencies across clients under these constraints. Each client maps high-dimensional local observations to low-dimensional latent states using a nonlinear state space model. A central server learns a graph-structured neural state transition model over the communicated latent states using a Graph Attention Network. For interpretability, we relate the Jacobian of the learned server-side transition model to attention coefficients, providing the first interpretable characterization of cross-client temporal interdependencies in decentralized nonlinear systems. We establish theoretical convergence guarantees to a centralized oracle and validate the framework through synthetic experiments demonstrating convergence, interpretability, scalability and privacy. Additional real-world experiments show performance comparable to decentralized baselines.
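The server-side mechanism the abstract describes can be sketched numerically: client latent states are combined through graph-attention coefficients, and the Jacobian of the resulting transition is compared against those coefficients. Everything below is an illustrative assumption (sizes, random parameters, a fully connected client graph, a tanh readout), not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 clients, each communicating a 3-dim latent state.
n, d = 4, 3
z = rng.standard_normal((n, d))          # stacked client latent states z_i(t)

W = 0.3 * rng.standard_normal((d, d))    # shared linear projection (illustrative)
a = 0.3 * rng.standard_normal(2 * d)     # attention parameter vector (illustrative)

def leaky_relu(x, s=0.2):
    return np.where(x > 0, x, s * x)

def gat_step(z):
    """One server-side transition z(t) -> z(t+1) with GAT-style attention."""
    h = z @ W.T
    e = np.array([[leaky_relu(np.dot(a, np.concatenate([h[i], h[j]])))
                   for j in range(n)] for i in range(n)])
    alpha = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)  # softmax over neighbors
    return alpha, np.tanh(alpha @ h)                          # coefficients, next state

alpha, z_next = gat_step(z)

# Interpretability link: a large block norm of the finite-difference Jacobian
# d z_i(t+1) / d z_j(t) should track a large attention coefficient alpha[i, j].
eps = 1e-5
J_norms = np.zeros((n, n))
for j in range(n):
    for k in range(d):
        zp = z.copy()
        zp[j, k] += eps
        J_col = (gat_step(zp)[1] - z_next) / eps              # shape (n, d)
        J_norms[:, j] += np.sum(J_col**2, axis=1)
J_norms = np.sqrt(J_norms)
```

Only latent states cross the client boundary here, matching the privacy constraint: the raw observations and the clients' fixed local models never appear on the server side.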
Related papers
- Learning Unknown Interdependencies for Decentralized Root Cause Analysis in Nonlinear Dynamical Systems [3.122408196953971]
Root cause analysis (RCA) in networked industrial systems is difficult due to unknown and dynamically evolving interdependencies among geographically distributed clients. This paper presents a federated cross-client interdependency learning methodology for feature-partitioned, nonlinear time-series data. We establish theoretical convergence guarantees and validate our approach on extensive simulations and a real-world industrial cybersecurity dataset.
arXiv Detail & Related papers (2026-02-25T14:05:38Z) - KoopGen: Koopman Generator Networks for Representing and Predicting Dynamical Systems with Continuous Spectra [65.11254608352982]
We introduce a generator-based neural Koopman framework that models dynamics through a structured, state-dependent representation of Koopman generators. By exploiting the intrinsic Cartesian decomposition into skew-adjoint and self-adjoint components, KoopGen separates conservative transport from irreversible dissipation.
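The Cartesian decomposition referenced above is the elementary identity that any real square matrix splits uniquely into a skew-symmetric (skew-adjoint) part and a symmetric (self-adjoint) part; a minimal sketch, with a random matrix standing in for a learned generator:

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((4, 4))   # stand-in for a learned Koopman generator

K = 0.5 * (G - G.T)               # skew-adjoint part: conservative transport
S = 0.5 * (G + G.T)               # self-adjoint part: irreversible dissipation

# The split is exact and unique: G = K + S with K^T = -K and S^T = S.
```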
arXiv Detail & Related papers (2026-02-15T06:32:23Z) - Stable-by-Design Neural Network-Based LPV State-Space Models for System Identification [6.5745172279769255]
We propose a neural network-based state-space model that simultaneously learns latent states and internal scheduling variables. The state-transition matrix is guaranteed to be stable through a Schur-based parameterization. The proposed NN-SS is evaluated on benchmark nonlinear systems, and the results demonstrate that the model consistently matches or surpasses classical subspace identification methods.
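One simple way to guarantee Schur stability (spectral radius below one) is to rescale an unconstrained matrix by its spectral norm; this is only an illustrative construction, not necessarily the paper's exact Schur-based parameterization:

```python
import numpy as np

def stable_transition(M, gamma=0.95):
    """Map an unconstrained matrix M to a Schur-stable transition matrix A.

    Scaling by the spectral norm enforces ||A||_2 <= gamma < 1, which in turn
    bounds every eigenvalue inside the unit circle (discrete-time stability).
    """
    sigma = np.linalg.norm(M, 2)          # spectral norm (largest singular value)
    return gamma * M / max(sigma, 1e-12)

rng = np.random.default_rng(2)
A = stable_transition(rng.standard_normal((5, 5)))
rho = max(abs(np.linalg.eigvals(A)))      # spectral radius, guaranteed < 1
```

Because the map is differentiable in M, such a reparameterization can sit inside a training loop while stability holds by construction at every step.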
arXiv Detail & Related papers (2025-10-21T10:25:54Z) - Generalizable Implicit Neural Representation As a Universal Spatiotemporal Traffic Data Learner [46.866240648471894]
Spatiotemporal Traffic Data (STTD) measures the complex dynamical behaviors of the multiscale transportation system.
We present a novel paradigm to address the STTD learning problem by parameterizing STTD as an implicit neural representation.
We validate its effectiveness through extensive experiments in real-world scenarios, showcasing applications from corridor to network scales.
arXiv Detail & Related papers (2024-06-13T02:03:22Z) - PeFAD: A Parameter-Efficient Federated Framework for Time Series Anomaly Detection [51.20479454379662]
We propose a parameter-efficient federated anomaly detection framework, named PeFAD, motivated by increasing privacy concerns.
We conduct extensive evaluations on four real datasets, where PeFAD outperforms existing state-of-the-art baselines by up to 28.74%.
arXiv Detail & Related papers (2024-06-04T13:51:08Z) - Stragglers-Aware Low-Latency Synchronous Federated Learning via Layer-Wise Model Updates [71.81037644563217]
Synchronous federated learning (FL) is a popular paradigm for collaborative edge learning.
As some of the devices may have limited computational resources and varying availability, FL latency is highly sensitive to stragglers.
We propose straggler-aware layer-wise federated learning (SALF) that leverages the optimization procedure of NNs via backpropagation to update the global model in a layer-wise fashion.
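The layer-wise idea can be sketched as per-layer averaging over whichever clients actually delivered that layer: since backpropagation computes gradients from the last layer backwards, a straggler that stops early still contributes its deepest layers. Client counts, shapes, and which layers each client finishes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n_layers, shape = 4, (2, 2)

# Each client reports gradients only for the layers it finished
# backpropagating through before the deadline (deepest layers first).
client_grads = [
    {l: rng.standard_normal(shape) for l in range(n_layers)},  # fast client: all layers
    {l: rng.standard_normal(shape) for l in (2, 3)},           # straggler: last two
    {l: rng.standard_normal(shape) for l in (3,)},             # slow straggler: last one
]

def layerwise_aggregate(client_grads, n_layers):
    """Average each layer over the clients that actually delivered it."""
    agg = {}
    for l in range(n_layers):
        contrib = [g[l] for g in client_grads if l in g]
        if contrib:
            agg[l] = np.mean(contrib, axis=0)
    return agg

agg = layerwise_aggregate(client_grads, n_layers)
```

Deep layers are thus updated from every client each round, while shallow layers fall back to the subset of fast clients instead of stalling the whole round.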
arXiv Detail & Related papers (2024-03-27T09:14:36Z) - A Hierarchical Framework with Spatio-Temporal Consistency Learning for Emergence Detection in Complex Adaptive Systems [41.055298739292695]
Emergence, a global property of complex adaptive systems, is prevalent in real-world dynamic systems, e.g., network-level traffic congestions.
This paper proposes a hierarchical framework with spatio-temporal consistency learning that detects emergence by learning the system representation together with agent representations.
Our method achieves more accurate detection than traditional methods and deep learning methods on three datasets with well-known yet hard-to-detect emergent behaviors.
arXiv Detail & Related papers (2024-01-18T08:55:05Z) - A Generic Shared Attention Mechanism for Various Backbone Neural Networks [53.36677373145012]
Self-attention modules (SAMs) produce strongly correlated attention maps across different layers.
Dense-and-Implicit Attention (DIA) shares SAMs across layers and employs a long short-term memory module.
Our simple yet effective DIA can consistently enhance various network backbones.
arXiv Detail & Related papers (2022-10-27T13:24:08Z) - Learning from Heterogeneous Data Based on Social Interactions over Graphs [58.34060409467834]
This work proposes a decentralized architecture, where individual agents aim at solving a classification problem while observing streaming features of different dimensions.
We show that the proposed strategy enables the agents to learn consistently under this highly heterogeneous setting.
arXiv Detail & Related papers (2021-12-17T12:47:18Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns these basis functions via supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - Multivariate Time Series Imputation by Graph Neural Networks [13.308026049048717]
We introduce a graph neural network architecture, named GRIL, which aims at reconstructing missing data in different channels of a multivariate time series.
Preliminary results show that our model outperforms state-of-the-art methods in the imputation task on relevant benchmarks.
arXiv Detail & Related papers (2021-07-31T17:47:10Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
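The construction of "networks of linear first-order dynamical systems" modulated by nonlinear gates can be illustrated with a single simplified unit under explicit-Euler discretization; the gate here depends only on the input (the original formulation also gates on the state), and all parameters are assumptions for the sketch:

```python
import numpy as np

def ltc_step(x, I, dt=0.01, tau=1.0, A=1.0, w=0.5, b=0.0):
    """One Euler step of a simplified liquid time-constant unit.

    dx/dt = -(1/tau + f) * x + f * A, where f = sigmoid(w*I + b) is a
    nonlinear gate that makes the effective time constant input-dependent.
    """
    f = 1.0 / (1.0 + np.exp(-(w * I + b)))
    return x + dt * (-(1.0 / tau + f) * x + f * A)

# With constant input, the state converges to the bounded fixed point
# x* = f*A / (1/tau + f), illustrating the stable, bounded behavior.
x = 0.0
for _ in range(1000):
    x = ltc_step(x, I=1.0)
```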
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.