Modeling Neural Activity with Conditionally Linear Dynamical Systems
- URL: http://arxiv.org/abs/2502.18347v1
- Date: Tue, 25 Feb 2025 16:36:24 GMT
- Title: Modeling Neural Activity with Conditionally Linear Dynamical Systems
- Authors: Victor Geadah, Amin Nejatbakhsh, David Lipshutz, Jonathan W. Pillow, Alex H. Williams
- Abstract summary: We develop Conditionally Linear Dynamical System models as a general-purpose method to characterize these dynamics. We find that CLDS models can perform well even in severely data-limited regimes. In example applications, we apply CLDS to model thalamic neurons that nonlinearly encode heading direction and to model motor cortical neurons during a cued reaching task.
- Score: 14.902340082626653
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural population activity exhibits complex, nonlinear dynamics, varying in time, over trials, and across experimental conditions. Here, we develop Conditionally Linear Dynamical System (CLDS) models as a general-purpose method to characterize these dynamics. These models use Gaussian Process (GP) priors to capture the nonlinear dependence of circuit dynamics on task and behavioral variables. Conditioned on these covariates, the data is modeled with linear dynamics. This allows for transparent interpretation and tractable Bayesian inference. We find that CLDS models can perform well even in severely data-limited regimes (e.g. one trial per condition) due to their Bayesian formulation and ability to share statistical power across nearby task conditions. In example applications, we apply CLDS to model thalamic neurons that nonlinearly encode heading direction and to model motor cortical neurons during a cued reaching task.
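The core idea of the abstract can be sketched in a few lines: for each condition c, the latent state evolves with purely linear dynamics x_{t+1} = A(c) x_t + noise, while the dynamics matrix A(c) varies smoothly with c. The sketch below is a minimal illustration, not the paper's method: a hand-written smooth map from condition to rotation matrix stands in for the GP prior over condition-dependent dynamics, and all names and values are hypothetical.

```python
import numpy as np

def dynamics_matrix(c, decay=0.98):
    """Condition-dependent 2D linear dynamics; a damped rotation whose
    angle varies smoothly with the condition c (toy stand-in for a GP)."""
    theta = 0.1 + 0.2 * np.sin(c)  # smooth dependence on the condition
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return decay * rot

def simulate_clds(c, T=100, noise_std=0.01, seed=0):
    """Simulate one trial: linear dynamics conditioned on covariate c."""
    rng = np.random.default_rng(seed)
    x = np.zeros((T, 2))
    x[0] = [1.0, 0.0]
    A = dynamics_matrix(c)  # linear once we condition on c
    for t in range(T - 1):
        x[t + 1] = A @ x[t] + noise_std * rng.standard_normal(2)
    return x

traj = simulate_clds(c=0.5)
print(traj.shape)  # one (T, 2) latent trajectory for this condition
```

Because nearby conditions produce nearby dynamics matrices, a smoothness prior over c lets the model share statistical power across conditions, which is the intuition behind the data-limited-regime results.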
Related papers
- Generative System Dynamics in Recurrent Neural Networks [56.958984970518564]
We investigate the continuous time dynamics of Recurrent Neural Networks (RNNs)
We show that skew-symmetric weight matrices are fundamental to enable stable limit cycles in both linear and nonlinear configurations.
Numerical simulations showcase how nonlinear activation functions not only maintain limit cycles, but also enhance the numerical stability of the system integration process.
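The skew-symmetric claim above has a simple linear intuition that can be checked numerically: for dx/dt = W x with W = -W^T, the norm of x is conserved, so trajectories orbit rather than decay or blow up. This is an illustrative sketch of that mechanism only, not the paper's experiments; the matrix and step sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
W = M - M.T  # skew-symmetric: W == -W.T, eigenvalues purely imaginary

x = np.ones(4)
dt = 1e-3
norms = []
for _ in range(5000):
    # midpoint (RK2) step; naive explicit Euler would slowly inflate the norm
    k1 = W @ x
    k2 = W @ (x + 0.5 * dt * k1)
    x = x + dt * k2
    norms.append(np.linalg.norm(x))

# Norm drift stays tiny: the linear system sustains oscillation
print(abs(norms[-1] - norms[0]))
```

The near-zero norm drift reflects the purely imaginary spectrum of a skew-symmetric matrix, which is what makes stable limit cycles possible in the linear case.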
arXiv Detail & Related papers (2025-04-16T10:39:43Z) - Hybrid Adaptive Modeling using Neural Networks Trained with Nonlinear Dynamics Based Features [5.652228574188242]
This paper introduces a novel approach that departs from standard techniques by uncovering information from nonlinear dynamical modeling and embedding it in data-based models.
By explicitly incorporating nonlinear dynamic phenomena through perturbation methods, the predictive capabilities are more realistic and insightful compared to knowledge obtained from brute-force numerical simulations.
arXiv Detail & Related papers (2025-01-21T02:38:28Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Probabilistic Decomposed Linear Dynamical Systems for Robust Discovery of Latent Neural Dynamics [5.841659874892801]
Time-varying linear state-space models are powerful tools for obtaining mathematically interpretable representations of neural signals.
Existing methods for latent variable estimation are not robust to dynamical noise and system nonlinearity.
We propose a probabilistic approach to latent variable estimation in decomposed models that improves robustness against dynamical noise.
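The classical baseline this line of work builds on is latent-state estimation in a linear state-space model with a Kalman filter. The sketch below shows that baseline only, with toy scalar parameters, not the decomposed probabilistic method of the paper: filtered estimates of the latent state beat the raw noisy observations.

```python
import numpy as np

A, C = 0.95, 1.0  # scalar state-transition and observation maps
Q, R = 0.1, 0.5   # dynamical (process) and observation noise variances

# Simulate the true latent state and noisy observations
rng = np.random.default_rng(1)
T = 200
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = A * x[t - 1] + np.sqrt(Q) * rng.standard_normal()
    y[t] = C * x[t] + np.sqrt(R) * rng.standard_normal()

# Kalman filter: recursively predict, then correct with each observation
m, P = 0.0, 1.0  # filter mean and variance
est = np.zeros(T)
for t in range(1, T):
    m_pred, P_pred = A * m, A * P * A + Q        # predict
    K = P_pred * C / (C * P_pred * C + R)        # Kalman gain
    m = m_pred + K * (y[t] - C * m_pred)         # update
    P = (1 - K * C) * P_pred
    est[t] = m

# Filtered estimates track the latent state better than raw observations
print(np.mean((est - x) ** 2) < np.mean((y - x) ** 2))
```

Robustness to dynamical noise, as discussed above, concerns exactly the process-noise term Q in this recursion: misspecifying it degrades the filter, which motivates the probabilistic treatment.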
arXiv Detail & Related papers (2024-08-29T18:58:39Z) - Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS). Our method builds on previous work modeling the latent state evolution via a differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs). Our approach resolves key limitations of the rSLDS, such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z) - Latent Dynamical Implicit Diffusion Processes [0.0]
We propose a novel latent variable model named latent dynamical implicit diffusion processes (LDIDPs)
LDIDPs utilize implicit diffusion processes to sample from dynamical latent processes and generate sequential observation samples accordingly.
We demonstrate that LDIDPs can accurately learn the dynamics over latent dimensions.
arXiv Detail & Related papers (2023-06-12T12:43:27Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
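The fit-once, reuse-everywhere idea above reduces to gradient-based parameter recovery through a differentiable forward model. The toy sketch below is only an illustration of that pattern, not the paper's Hamiltonian setup: a one-parameter model is fit to synthetic "experimental" data by gradient descent, with the gradient written out by hand in place of automatic differentiation.

```python
import numpy as np

def model(x, a):
    """Differentiable surrogate forward model (hypothetical stand-in)."""
    return a * np.sin(x)

x = np.linspace(0, 2 * np.pi, 50)
y_obs = model(x, a=2.5)  # synthetic "experimental" data, true a = 2.5

a = 0.0   # initial guess for the unknown parameter
lr = 0.1
for _ in range(200):
    resid = model(x, a) - y_obs
    grad = np.mean(2 * resid * np.sin(x))  # d(MSE)/da, derived by hand
    a -= lr * grad

print(round(a, 3))  # recovered parameter, close to the true value 2.5
```

In the real setting an autodiff framework supplies `grad` for free, so the same trained differentiable model can be refit to new multi-dimensional data in real time.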
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Neural Abstractions [72.42530499990028]
We present a novel method for the safety verification of nonlinear dynamical models that uses neural networks to represent abstractions of their dynamics.
We demonstrate that our approach performs comparably to the mature tool Flow* on existing benchmark nonlinear models.
arXiv Detail & Related papers (2023-01-27T12:38:09Z) - Stability Preserving Data-driven Models With Latent Dynamics [0.0]
We introduce a data-driven modeling approach for dynamics problems with latent variables.
We present a model framework where the stability of the coupled dynamics can be easily enforced.
arXiv Detail & Related papers (2022-04-20T00:41:10Z) - Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.