Tensor Network Framework for Forecasting Nonlinear and Chaotic Dynamics
- URL: http://arxiv.org/abs/2511.09233v1
- Date: Thu, 13 Nov 2025 01:41:52 GMT
- Title: Tensor Network Framework for Forecasting Nonlinear and Chaotic Dynamics
- Authors: Jia-Bin You, Jian Feng Kong, Jun Ye
- Abstract summary: We present a tensor network model (TNM) for forecasting nonlinear and chaotic dynamics. We show that the TNM accurately reconstructs short-term trajectories and faithfully captures the attractor geometry.
- Score: 1.790605517028706
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a tensor network model (TNM) for forecasting nonlinear and chaotic dynamics, bridging quantum many-body methods with classical complex systems. The TNM leverages hierarchical tensor contractions to encode non-Markovian temporal correlations and multiscale structures, enabling compact and interpretable representations of chaotic flows. Using the Lorenz and Rössler systems as benchmarks, we show that the TNM accurately reconstructs short-term trajectories and faithfully captures the attractor geometry. The model enables robust short-term forecasting beyond several Lyapunov times, offering a meaningful horizon for data-driven prediction under chaos. Inhomogeneous parametrization of weight tensors improves convergence and robustness compared to homogeneous parametrization, while scaling with bond dimension reveals saturation beyond modest values, consistent with the low intrinsic dimensionality of the chaotic attractor. This work establishes tensor networks as a universal paradigm for data-driven modeling of complex dynamical systems, offering physically motivated control of model expressivity and opening pathways toward applications in climate systems and hybrid quantum-classical simulations.
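The abstract describes forecasting via hierarchical tensor contractions whose expressivity is controlled by a bond dimension. The following is a minimal illustrative sketch, not the authors' implementation: it contracts a window of past Lorenz states through a matrix-product-state-style chain of weight tensors to emit a predicted next state. The feature map, shapes, and random (untrained) cores are all assumptions chosen for demonstration.

```python
import numpy as np

def lorenz_step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz system."""
    x, y, z = s
    return s + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def mps_predict(window, cores):
    """Contract local features through an MPS-like chain of weight tensors.

    window : (T, 3) array of past states
    cores  : list of T tensors; cores[t] has shape (D_left, d, D_right),
             and the last core's right leg has size 3 (the predicted state).
    """
    # Local feature map phi(s) = (1, x, y, z), so the physical dimension d = 4.
    feats = np.hstack([np.ones((window.shape[0], 1)), window])
    v = np.ones(1)  # left boundary vector (bond dimension 1)
    for t, core in enumerate(cores):
        # Contract the bond index with v and the feature index with phi(s_t).
        v = np.einsum("i,ijk,j->k", v, core, feats[t])
    return v  # shape (3,): predicted next state

# Random cores only demonstrate the shapes; in a real model they would be
# trained, e.g. by gradient descent or DMRG-style sweeps.
rng = np.random.default_rng(0)
T, d, D = 5, 4, 8  # window length, feature dim, bond dimension
shapes = [(1, d, D)] + [(D, d, D)] * (T - 2) + [(D, d, 3)]
cores = [0.1 * rng.standard_normal(s) for s in shapes]

traj = [np.array([1.0, 1.0, 1.0])]
for _ in range(T):
    traj.append(lorenz_step(traj[-1]))
pred = mps_predict(np.array(traj[:T]), cores)
print(pred.shape)  # (3,)
```

The bond dimension `D` plays the role described in the abstract: enlarging it increases the correlations the chain can encode, while the paper reports saturation beyond modest values, consistent with the attractor's low intrinsic dimensionality.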
Related papers
- Graph neural network force fields for adiabatic dynamics of lattice Hamiltonians [0.0]
We develop a graph neural network (GNN)-based force-field framework for the adiabatic dynamics of lattice Hamiltonians. Trained on exact-diagonalization data, the GNN achieves high force accuracy, strict linear scaling with system size, and transferability to large lattices. These results establish GNNs as an elegant and efficient architecture for symmetry-aware, large-scale dynamical simulations of correlated lattice systems.
arXiv Detail & Related papers (2026-03-02T16:23:25Z) - KoopGen: Koopman Generator Networks for Representing and Predicting Dynamical Systems with Continuous Spectra [65.11254608352982]
We introduce a generator-based neural Koopman framework that models dynamics through a structured, state-dependent representation of Koopman generators. By exploiting the intrinsic Cartesian decomposition into skew-adjoint and self-adjoint components, KoopGen separates conservative transport from irreversible dissipation.
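The Cartesian decomposition mentioned in the KoopGen summary is a standard linear-algebra fact, sketched below with a random matrix standing in for a learned generator (the matrix itself and its size are illustrative assumptions; in practice it would come from the generator network).

```python
import numpy as np

rng = np.random.default_rng(1)
L = rng.standard_normal((4, 4))  # stand-in for a learned Koopman generator

# Cartesian decomposition: L = L_skew + L_sym.
L_skew = 0.5 * (L - L.T)  # skew-adjoint part: norm-preserving (conservative) flow
L_sym = 0.5 * (L + L.T)   # self-adjoint part: growth/decay (dissipation)

assert np.allclose(L, L_skew + L_sym)
# The skew part has purely imaginary eigenvalues; the symmetric part, real ones.
print(np.allclose(np.linalg.eigvals(L_skew).real, 0.0, atol=1e-10))  # True
```

This separation is what lets a generator-based model assign conservative transport and irreversible dissipation to distinct, structurally constrained components.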
arXiv Detail & Related papers (2026-02-15T06:32:23Z) - Learning to Dissipate Energy in Oscillatory State-Space Models [51.98491034847041]
State-space models (SSMs) are a class of networks for sequence learning. We show that D-LinOSS consistently outperforms previous LinOSS methods on long-range learning tasks.
arXiv Detail & Related papers (2025-05-17T23:15:17Z) - Predicting Forced Responses of Probability Distributions via the Fluctuation-Dissipation Theorem and Generative Modeling [0.0]
We present a data-driven framework for estimating the response of higher-order moments of nonlinear systems to small external perturbations. We combine GFDT with score-based generative modeling to estimate the system's score function directly from data. Our method is validated on several models relevant to climate dynamics.
arXiv Detail & Related papers (2025-04-17T20:54:33Z) - A short trajectory is all you need: A transformer-based model for long-time dissipative quantum dynamics [0.0]
We show that a deep artificial neural network can predict the long-time population dynamics of a quantum system coupled to a dissipative environment.
Our model is more accurate than classical forecasting models, such as recurrent neural networks.
arXiv Detail & Related papers (2024-09-17T16:17:52Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting (LTSF).
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Evolve Smoothly, Fit Consistently: Learning Smooth Latent Dynamics for Advection-Dominated Systems [14.553972457854517]
We present a data-driven, space-time continuous framework to learn surrogate models for complex physical systems.
We leverage the expressive power of the network and a specially designed consistency-inducing regularization to obtain latent trajectories that are both low-dimensional and smooth.
arXiv Detail & Related papers (2023-01-25T03:06:03Z) - Predicting Physics in Mesh-reduced Space with Temporal Attention [15.054026802351146]
We propose a new method that captures long-term dependencies through a transformer-style temporal attention model.
Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks.
We believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
arXiv Detail & Related papers (2022-01-22T18:32:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.