Tensor State Space-based Dynamic Multilayer Network Modeling
- URL: http://arxiv.org/abs/2506.02413v1
- Date: Tue, 03 Jun 2025 03:58:32 GMT
- Title: Tensor State Space-based Dynamic Multilayer Network Modeling
- Authors: Tian Lan, Jie Guo, Chen Zhang
- Abstract summary: Existing models often fail to capture such networks' temporal and cross-layer dynamics. This paper introduces a novel Tensor State Space Model for Dynamic Multilayer Networks (TSSDMN), utilizing a latent space model framework. Numerical simulations and case studies demonstrate the efficacy of TSSDMN for understanding dynamic multilayer networks.
- Score: 24.860214033275515
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Understanding the complex interactions within dynamic multilayer networks is critical for advancements in various scientific domains. Existing models often fail to capture such networks' temporal and cross-layer dynamics. This paper introduces a novel Tensor State Space Model for Dynamic Multilayer Networks (TSSDMN), utilizing a latent space model framework. TSSDMN employs a symmetric Tucker decomposition to represent latent node features, their interaction patterns, and layer transitions. Then by fixing the latent features and allowing the interaction patterns to evolve over time, TSSDMN uniquely captures both the temporal dynamics within layers and across different layers. The model identifiability conditions are discussed. By treating latent features as variables whose posterior distributions are approximated using a mean-field variational inference approach, a variational Expectation Maximization algorithm is developed for efficient model inference. Numerical simulations and case studies demonstrate the efficacy of TSSDMN for understanding dynamic multilayer networks.
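As a rough illustration of the generative structure described in the abstract, the following is a minimal Python sketch, assuming a symmetric Tucker form in which layer l's expected adjacency at time t is U G_t[l] U^T, with the node features U held fixed and the per-layer interaction cores G_t evolving under a linear state transition. All names, dimensions, the noise scale, and the transition matrix are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, L, steps = 30, 4, 3, 50        # nodes, latent dim, layers, time steps

U = rng.normal(size=(n, d))          # fixed latent node features
G = rng.normal(size=(L, d, d))       # per-layer interaction cores
G = 0.5 * (G + G.transpose(0, 2, 1)) # keep cores symmetric
T = 0.95 * np.eye(L * d * d)         # assumed linear transition on vec(G)

expected_adjacency = []
for _ in range(steps):
    # Symmetric Tucker form: layer l's expected adjacency is U @ G[l] @ U.T.
    expected_adjacency.append(np.einsum('id,ldk,jk->lij', U, G, U))
    # State-space step: interaction patterns evolve; node features stay fixed.
    g = T @ G.reshape(-1) + 0.01 * rng.normal(size=L * d * d)
    G = g.reshape(L, d, d)
    G = 0.5 * (G + G.transpose(0, 2, 1))
```

In the paper itself, the latent features are treated as variables with mean-field variational posteriors and the model is fit by a variational EM algorithm; the sketch above only mimics the forward (generative) structure.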
Related papers
- Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Learning Joint Latent Space EBM Prior Model for Multi-layer Generator [44.4434704520236]
We study the fundamental problem of learning multi-layer generator models.
We propose an energy-based model (EBM) on the joint latent space over all layers of latent variables.
Our experiments demonstrate that the learned model can be expressive in generating high-quality images.
arXiv Detail & Related papers (2023-06-10T00:27:37Z) - Piecewise-Velocity Model for Learning Continuous-time Dynamic Node Representations [0.0]
We propose the Piecewise-Velocity Model (PiVeM) for representing continuous-time dynamic networks.
We show that PiVeM can successfully represent network structure and dynamics in ultra-low two-dimensional spaces.
It outperforms relevant state-of-the-art methods in downstream tasks such as link prediction.
arXiv Detail & Related papers (2022-12-23T13:57:56Z) - A Mutually Exciting Latent Space Hawkes Process Model for Continuous-time Networks [3.883893461313154]
We propose a novel generative model for continuous-time networks of relational events using a latent space representation for nodes.
We show that our proposed LSH model can replicate many features observed in real temporal networks including reciprocity and transitivity.
arXiv Detail & Related papers (2022-05-19T00:56:12Z) - Multi-Scale Semantics-Guided Neural Networks for Efficient Skeleton-Based Human Action Recognition [140.18376685167857]
A simple yet effective multi-scale semantics-guided neural network is proposed for skeleton-based action recognition.
MS-SGN achieves state-of-the-art performance on the NTU60, NTU120, and SYSU datasets.
arXiv Detail & Related papers (2021-11-07T03:50:50Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Multi-View Dynamic Heterogeneous Information Network Embedding [3.8093526291513347]
We propose a novel framework for incorporating temporal information into HIN embedding, denoted as Multi-View Dynamic HIN Embedding (MDHNE).
Our proposed MDHNE applies a Recurrent Neural Network (RNN) to incorporate the evolving patterns of complex network structure and semantic relationships between nodes into latent embedding spaces.
Our model outperforms state-of-the-art baselines on three real-world dynamic datasets for different network mining tasks.
arXiv Detail & Related papers (2020-11-12T12:33:29Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations; a minimal update sketch in this spirit appears after this list.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - MagNet: Discovering Multi-agent Interaction Dynamics using Neural Network [11.285833408524708]
MagNet is a neural network-based multi-agent interaction model.
It is trained to discover the core dynamics of a multi-agent system from observations.
It is tuned on-line to learn agent-specific parameters to ensure accurate prediction.
arXiv Detail & Related papers (2020-01-24T13:41:01Z) - Relational State-Space Model for Stochastic Multi-Object Systems [24.234120525358456]
This paper introduces the relational state-space model (R-SSM), a sequential hierarchical latent variable model.
R-SSM makes use of graph neural networks (GNNs) to simulate the joint state transitions of multiple correlated objects.
The utility of R-SSM is empirically evaluated on synthetic and real time-series datasets.
arXiv Detail & Related papers (2020-01-13T03:45:21Z)
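The Liquid Time-constant Networks entry above describes networks of linear first-order dynamical systems whose effective time constant is modulated by the input and state. As a hedged sketch of that idea (not the authors' implementation), the following Python snippet takes explicit-Euler steps of dx/dt = -(1/tau + f) x + f A, where the tanh gate f, all shapes, and the toy sinusoidal drive are illustrative assumptions.

```python
import numpy as np

def ltc_step(x, inp, W, Uin, b, tau, A, dt=0.1):
    """One explicit-Euler step of dx/dt = -(1/tau + f) * x + f * A."""
    f = np.tanh(W @ x + Uin @ inp + b)   # state- and input-dependent gate
    dx = -(1.0 / tau + f) * x + f * A    # input-modulated ("liquid") time constant
    return x + dt * dx

rng = np.random.default_rng(1)
h, m = 8, 3                              # hidden units, input dimension
x = np.zeros(h)
W = 0.1 * rng.normal(size=(h, h))
Uin = 0.1 * rng.normal(size=(h, m))
b, tau, A = np.zeros(h), np.ones(h), rng.normal(size=h)

for t in range(100):                     # drive the cell with a toy sinusoid
    x = ltc_step(x, np.sin(0.1 * t) * np.ones(m), W, Uin, b, tau, A)
```

Because the gate f scales the decay term, each unit's effective time constant varies with its input, which is the property the entry credits for stable, bounded dynamics.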
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.