Causally constrained reduced-order neural models of complex turbulent dynamical systems
- URL: http://arxiv.org/abs/2602.13847v2
- Date: Tue, 17 Feb 2026 04:31:35 GMT
- Title: Causally constrained reduced-order neural models of complex turbulent dynamical systems
- Authors: Fabrizio Falasca, Laure Zanna
- Abstract summary: We introduce a flexible framework based on response theory and score matching to suppress spurious, noncausal dependencies in reduced-order neural emulators of turbulent systems. We showcase the approach using the Charney-DeVore model as a relevant prototype for low-frequency atmospheric variability.
- Score: 0.6094969391643746
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce a flexible framework based on response theory and score matching to suppress spurious, noncausal dependencies in reduced-order neural emulators of turbulent systems, focusing on climate dynamics as a proof-of-concept. We showcase the approach using the stochastic Charney-DeVore model as a relevant prototype for low-frequency atmospheric variability. We show that the resulting causal constraints enhance neural emulators' ability to respond to both weak and strong external forcings, despite being trained exclusively on unforced data. The approach is broadly applicable to modeling complex turbulent dynamical systems in reduced spaces and can be readily integrated into general neural network architectures.
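As a rough, non-authoritative illustration of the score-matching ingredient mentioned in the abstract, the sketch below fits a neural score model to samples of the reduced state via denoising score matching. It is a minimal sketch under stated assumptions: ScoreNet, the noise level sigma, and the layer sizes are placeholders, and the response-theory (causality) constraint that the paper builds on top of such a score estimate is not shown.

```python
# Illustrative sketch, not the authors' code: denoising score matching for the
# invariant density of the reduced variables.  ScoreNet, sigma and the layer
# sizes are placeholder assumptions for this example.
import torch
import torch.nn as nn

class ScoreNet(nn.Module):
    """Small MLP s_theta(x) approximating the score grad_x log p(x) of the reduced state."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x):
        return self.net(x)

def denoising_score_matching_loss(score_net, x, sigma=0.1):
    """Denoising score matching (Vincent, 2011): regress the score of the
    Gaussian-perturbed data distribution."""
    noise = torch.randn_like(x)
    x_tilde = x + sigma * noise
    target = -noise / sigma  # score of the Gaussian perturbation kernel
    return ((score_net(x_tilde) - target) ** 2).sum(dim=-1).mean()

# Example usage on a batch of reduced coordinates from unforced simulation data:
# score_net = ScoreNet(dim=6)
# loss = denoising_score_matching_loss(score_net, batch)
```

A learned score of the unforced statistics is the kind of quantity that fluctuation-dissipation-type response formulas take as input, which is plausibly where score matching and response theory meet in this setting.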
Related papers
- Improving Long-Range Interactions in Graph Neural Simulators via Hamiltonian Dynamics [71.53370807809296]
Recent Graph Neural Simulators (GNSs) accelerate simulations by learning dynamics on graph-structured data. We propose Information-preserving Graph Neural Simulators (IGNS), a graph-based neural simulator built on the principles of Hamiltonian dynamics. IGNS consistently outperforms state-of-the-art GNSs, achieving higher accuracy and stability on challenging and complex dynamical systems.
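As a minimal sketch of the Hamiltonian structure this summary refers to (not the IGNS architecture; potential_net and dt are placeholder assumptions), a symplectic leapfrog update of node states driven by a learned potential could look like this:

```python
# Illustrative leapfrog (symplectic) step under a learned potential energy.
# This is a generic Hamiltonian-dynamics sketch, not the IGNS model.
import torch

def leapfrog_step(q, p, potential_net, dt=0.01):
    """Update node positions q and momenta p with one leapfrog step,
    where U(q) = potential_net(q).sum() is a learned potential."""
    q = q.detach().requires_grad_(True)
    grad_u = torch.autograd.grad(potential_net(q).sum(), q)[0]
    p_half = p - 0.5 * dt * grad_u                            # half kick
    q_new = (q + dt * p_half).detach().requires_grad_(True)   # drift
    grad_u_new = torch.autograd.grad(potential_net(q_new).sum(), q_new)[0]
    p_new = p_half - 0.5 * dt * grad_u_new                    # half kick
    return q_new.detach(), p_new.detach()
```

Symplectic updates of this kind approximately conserve energy over long rollouts, which is the usual motivation for building simulators on Hamiltonian dynamics.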
arXiv Detail & Related papers (2025-11-11T12:53:56Z) - Neuronal Group Communication for Efficient Neural representation [85.36421257648294]
This paper addresses the question of how to build large neural systems that learn efficient, modular, and interpretable representations. We propose Neuronal Group Communication (NGC), a theory-driven framework that reimagines a neural network as a dynamical system of interacting neuronal groups. NGC treats weights as transient interactions between embedding-like neuronal states, with neural computation unfolding through iterative communication among groups of neurons.
arXiv Detail & Related papers (2025-10-19T14:23:35Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
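For concreteness, a minimal Euler-Maruyama discretization of the underdamped Langevin equation mentioned above is sketched below; this is not the LangevinFlow implementation, and grad_U, force, gamma and dt are placeholder assumptions.

```python
# Illustrative underdamped Langevin update for a latent state z with velocity v.
# grad_U, force, gamma and dt are placeholders, not LangevinFlow internals.
import numpy as np

def langevin_step(z, v, grad_U, force, gamma=1.0, dt=1e-2, rng=None):
    """One semi-implicit Euler-Maruyama step: damping, potential gradient,
    external force, and thermal noise act on the velocity."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.standard_normal(np.shape(v))
    v_new = v + dt * (-gamma * v - grad_U(z) + force) + np.sqrt(2.0 * gamma * dt) * noise
    z_new = z + dt * v_new
    return z_new, v_new
```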
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - High-Fidelity Scientific Simulation Surrogates via Adaptive Implicit Neural Representations [51.90920900332569]
Implicit neural representations (INRs) offer a compact and continuous framework for modeling spatially structured data. Recent approaches improve their fidelity by introducing additional features along rigid geometric structures. We propose a simple yet effective alternative: Feature-Adaptive INR (FA-INR).
arXiv Detail & Related papers (2025-06-07T16:45:17Z) - Certified Neural Approximations of Nonlinear Dynamics [51.01318247729693]
In safety-critical contexts, the use of neural approximations requires formal bounds on their closeness to the underlying system. We propose a novel, adaptive, and parallelizable verification method based on certified first-order models.
arXiv Detail & Related papers (2025-05-21T13:22:20Z) - High-order expansion of Neural Ordinary Differential Equations flows [4.4569182855550755]
We introduce Event Transitions, a framework based on high-order differentials that provides a rigorous mathematical description of neural ODE dynamics on event manifolds. Our findings contribute to a deeper theoretical foundation for event-triggered neural differential equations and provide a mathematical construct for explaining complex system dynamics.
arXiv Detail & Related papers (2025-04-02T08:57:34Z) - Neural Manifolds and Cognitive Consistency: A New Approach to Memory Consolidation in Artificial Systems [0.0]
We introduce a novel mathematical framework that unifies neural population dynamics, hippocampal sharp wave-ripple (SpWR) generation, and cognitive consistency constraints inspired by Heider's theory. Our model leverages low-dimensional manifold representations to capture structured neural drift and incorporates a balance energy function to enforce coherent synaptic interactions. This work paves the way for scalable neuromorphic architectures that bridge neuroscience and artificial intelligence, offering more robust and adaptive learning mechanisms for future intelligent systems.
arXiv Detail & Related papers (2025-02-25T18:28:25Z) - Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently learns a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z) - Dynamics with autoregressive neural quantum states: application to critical quench dynamics [41.94295877935867]
We present an alternative general scheme that enables one to capture long-time dynamics of quantum systems in a stable fashion.
We apply the scheme to time-dependent quench dynamics by investigating the Kibble-Zurek mechanism in the two-dimensional quantum Ising model.
arXiv Detail & Related papers (2022-09-07T15:50:00Z) - Neural annealing and visualization of autoregressive neural networks in the Newman-Moore model [0.45119235878273]
We show that the glassy dynamics exhibited by the Newman-Moore model, caused by the presence of fracton excitations in the configurational space, likely manifests itself through trainability issues and mode collapse in the optimization landscape.
arXiv Detail & Related papers (2022-04-24T13:15:28Z) - Evaluating complexity and resilience trade-offs in emerging memory inference machines [0.6970352368216021]
We show that compact implementations of deep neural networks are unexpectedly susceptible to collapse from multiple system disturbances.
Our work proposes a middle path towards high performance and strong resilience utilizing the Mosaics framework.
arXiv Detail & Related papers (2020-02-25T21:40:08Z)