Scientific Machine Learning of Chaotic Systems Discovers Governing Equations for Neural Populations
- URL: http://arxiv.org/abs/2507.03631v2
- Date: Thu, 10 Jul 2025 21:26:10 GMT
- Title: Scientific Machine Learning of Chaotic Systems Discovers Governing Equations for Neural Populations
- Authors: Anthony G. Chesebro, David Hofmann, Vaibhav Dixit, Earl K. Miller, Richard H. Granger, Alan Edelman, Christopher V. Rackauckas, Lilianne R. Mujica-Parodi, Helmut H. Strey
- Abstract summary: We introduce the PEM-UDE method to extract interpretable mathematical expressions from chaotic dynamical systems. When applied to neural populations, our method derives novel governing equations that respect biological constraints. These equations predict an emergent relationship between connection density and both oscillation frequency and synchrony in neural circuits.
- Score: 0.05804487044220691
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Discovering governing equations that describe complex chaotic systems remains a fundamental challenge in physics and neuroscience. Here, we introduce the PEM-UDE method, which combines the prediction-error method with universal differential equations to extract interpretable mathematical expressions from chaotic dynamical systems, even with limited or noisy observations. This approach succeeds where traditional techniques fail by smoothing optimization landscapes and removing the chaotic properties during the fitting process without distorting optimal parameters. We demonstrate its efficacy by recovering hidden states in the Rössler system and reconstructing dynamics from noise-corrupted electrical circuit data, where the correct functional form of the dynamics is recovered even when one of the observed time series is corrupted by noise 5x the magnitude of the true signal. We demonstrate that this method is capable of recovering the correct dynamics, whereas direct symbolic regression methods, such as SINDy, fail to do so with the given amount of data and noise. Importantly, when applied to neural populations, our method derives novel governing equations that respect biological constraints such as network sparsity - a constraint necessary for cortical information processing yet not captured in next-generation neural mass models - while preserving microscale neuronal parameters. These equations predict an emergent relationship between connection density and both oscillation frequency and synchrony in neural circuits. We validate these predictions using three intracranial electrode recording datasets from the medial entorhinal cortex, prefrontal cortex, and orbitofrontal cortex. Our work provides a pathway to develop mechanistic, multi-scale brain models that generalize across diverse neural architectures, bridging the gap between single-neuron dynamics and macroscale brain activity.
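The core intuition of the prediction-error method - anchoring every one-step prediction to the observed data so the exponential sensitivity of chaotic trajectories never enters the loss - can be sketched in a toy example. The NumPy sketch below is a loose illustration, not the authors' PEM-UDE implementation (which uses universal differential equations with trained neural components): it treats the z-equation of the Rössler system as unknown and recovers its terms by minimizing the teacher-forced one-step prediction error, which for an Euler discretization reduces to linear least squares over a hypothetical candidate library.

```python
import numpy as np

# Rössler system with the standard chaotic parameters a = b = 0.2, c = 5.7.
a, b, c = 0.2, 0.2, 5.7

def rossler(s):
    x, y, z = s
    return np.array([-y - z, x + a * y, b + z * (x - c)])

# Generate a reference trajectory with a small RK4 integrator.
dt, n = 0.01, 5000
traj = np.empty((n, 3))
traj[0] = (1.0, 1.0, 1.0)
for k in range(n - 1):
    s = traj[k]
    k1 = rossler(s)
    k2 = rossler(s + 0.5 * dt * k1)
    k3 = rossler(s + 0.5 * dt * k2)
    k4 = rossler(s + dt * k3)
    traj[k + 1] = s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Pretend the z-equation is unknown. With teacher forcing, every one-step
# prediction restarts from the *observed* state, so chaotic divergence never
# accumulates in the loss. For an Euler step, minimizing the one-step
# prediction error becomes linear least squares over a candidate library.
x, z = traj[:-1, 0], traj[:-1, 2]
library = np.column_stack([np.ones_like(z), z, x * z])  # candidate terms: 1, z, x*z
target = (traj[1:, 2] - traj[:-1, 2]) / dt              # observed one-step change rate
coef, *_ = np.linalg.lstsq(library, target, rcond=None)

print(coef)  # should land near the true coefficients [b, -c, 1]
```

The recovered coefficients approximate the true z-equation, dz/dt = b + z(x - c); the key point is that the regression never integrates the fitted model forward over long horizons, which is what makes chaotic loss landscapes intractable for naive trajectory fitting.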
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- NOBLE -- Neural Operator with Biologically-informed Latent Embeddings to Capture Experimental Variability in Biological Neuron Models [68.89389652724378]
NOBLE is a neural operator framework that learns a mapping from a continuous frequency-modulated embedding of interpretable neuron features to the somatic voltage response induced by current injection. It predicts distributions of neural dynamics accounting for the intrinsic experimental variability. NOBLE is the first scaled-up deep learning framework validated on real experimental data.
arXiv Detail & Related papers (2025-06-05T01:01:18Z)
- Certified Neural Approximations of Nonlinear Dynamics [52.79163248326912]
In safety-critical contexts, the use of neural approximations requires formal bounds on their closeness to the underlying system. We propose a novel, adaptive, and parallelizable verification method based on certified first-order models.
arXiv Detail & Related papers (2025-05-21T13:22:20Z)
- Generative Modeling of Neural Dynamics via Latent Stochastic Differential Equations [1.5467259918426441]
We propose a framework for developing computational models of biological neural systems. We employ a system of coupled differential equations with differentiable drift and diffusion functions. We show that these hybrid models achieve competitive performance in predicting stimulus-evoked neural and behavioral responses.
arXiv Detail & Related papers (2024-12-01T09:36:03Z)
- A scalable generative model for dynamical system reconstruction from neuroimaging data [5.777167013394619]
Data-driven inference of the generative dynamics underlying a set of observed time series is of growing interest in machine learning.
Recent breakthroughs in training techniques for state space models (SSMs) specifically geared toward dynamical systems reconstruction (DSR) enable recovery of the underlying system.
We propose a novel algorithm that solves this problem and scales exceptionally well with model dimensionality and filter length.
arXiv Detail & Related papers (2024-11-05T09:45:57Z)
- Inferring stochastic low-rank recurrent neural networks from neural data [5.179844449042386]
A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system. Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics. Here, we propose to fit low-rank RNNs with variational sequential Monte Carlo methods.
arXiv Detail & Related papers (2024-06-24T15:57:49Z)
- Individualized Dosing Dynamics via Neural Eigen Decomposition [51.62933814971523]
We introduce the Neural Eigen Stochastic Differential Equation algorithm (NESDE).
NESDE provides individualized modeling, tunable generalization to new treatment policies, and fast, continuous, closed-form prediction.
We demonstrate the robustness of NESDE in both synthetic and real medical problems, and use the learned dynamics to publish simulated medical gym environments.
arXiv Detail & Related papers (2023-06-24T17:01:51Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed biophysical cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Learning low-dimensional dynamics from whole-brain data improves task capture [2.82277518679026]
We introduce a novel approach to learning low-dimensional approximations of neural dynamics by using a sequential variational autoencoder (SVAE).
Our method finds smooth dynamics that can predict cognitive processes with accuracy higher than classical methods.
We evaluate our approach on various task-fMRI datasets, including motor, working memory, and relational processing tasks.
arXiv Detail & Related papers (2023-05-18T18:43:13Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Is the brain macroscopically linear? A system identification of resting state dynamics [7.312557272609717]
A central challenge in the computational modeling of neural dynamics is the trade-off between accuracy and simplicity.
We provide a rigorous and data-driven investigation of this hypothesis at the level of whole-brain blood-oxygen-level-dependent (BOLD) and macroscopic field potential dynamics.
Our results can greatly facilitate our understanding of macroscopic neural dynamics and the principled design of model-based interventions for the treatment of neuropsychiatric disorders.
arXiv Detail & Related papers (2020-12-22T20:51:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.