Neural State-Space Modeling with Latent Causal-Effect Disentanglement
- URL: http://arxiv.org/abs/2209.12387v1
- Date: Mon, 26 Sep 2022 03:02:20 GMT
- Title: Neural State-Space Modeling with Latent Causal-Effect Disentanglement
- Authors: Maryam Toloubidokhti, Ryan Missel, Xiajun Jiang, Niels Otani, Linwei
Wang
- Abstract summary: We discuss a novel technique for reconstructing local activity that, while small in signal strength, is the cause of subsequent global activities that have larger signal strength.
Our central innovation is to approach this by explicitly modeling and disentangling how the latent state of a system is influenced by potential hidden internal interventions.
- Because the intervention cannot be directly observed but has to be disentangled from its observed subsequent effect, we integrate knowledge of the native, intervention-free dynamics of a system and infer the hidden intervention by assuming it to be responsible for the differences observed between the actual and the hypothetical intervention-free dynamics.
- Score: 3.5507435095193896
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite substantial progress in deep learning approaches to time-series
reconstruction, no existing methods are designed to uncover local activities
with minute signal strength due to their negligible contribution to the
optimization loss. Such local activities, however, can signify important abnormal
events in physiological systems, such as an extra focus triggering an abnormal
propagation of electrical waves in the heart. We discuss a novel technique for
reconstructing such local activity that, while small in signal strength, is the
cause of subsequent global activities that have larger signal strength. Our
central innovation is to approach this by explicitly modeling and disentangling
how the latent state of a system is influenced by potential hidden internal
interventions. In a novel neural formulation of state-space models (SSMs), we
first introduce causal-effect modeling of the latent dynamics via a system of
interacting neural ODEs that separately describes 1) the continuous-time
dynamics of the internal intervention, and 2) its effect on the trajectory of
the system's native state. Because the intervention cannot be directly
observed but has to be disentangled from its observed subsequent effect, we
integrate knowledge of the native intervention-free dynamics of a system, and
infer the hidden intervention by assuming it to be responsible for differences
observed between the actual and hypothetical intervention-free dynamics. We
demonstrate a proof of concept of the presented framework by reconstructing
ectopic foci disrupting the course of normal cardiac electrical propagation
from remote observations.
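The causal-effect disentanglement described in the abstract can be sketched numerically. The toy below is not the paper's neural implementation: the linear drift functions, additive coupling, and Euler step size are illustrative assumptions. It couples two ODEs, one for the hidden intervention and one for the system's native state, and then recovers the intervention as the residual between the observed state increments and those predicted by the intervention-free model, mirroring the inference assumption in the abstract.

```python
import numpy as np


def native_dynamics(z):
    # Hypothetical intervention-free drift: simple decay toward rest.
    return -0.5 * z


def intervention_dynamics(u):
    # Hypothetical continuous-time dynamics of the hidden intervention.
    return -0.2 * u


def simulate(z0, u0, dt=0.01, steps=500):
    """Euler-integrate the coupled system: the intervention u follows its
    own ODE and additively perturbs the native state trajectory z."""
    zs, us = [z0], []
    z, u = z0, u0
    for _ in range(steps):
        u = u + dt * intervention_dynamics(u)   # 1) intervention dynamics
        z = z + dt * (native_dynamics(z) + u)   # 2) its effect on the state
        zs.append(z)
        us.append(u)
    return np.array(zs), np.array(us)


def infer_intervention(zs, dt=0.01):
    """Recover the hidden intervention as the residual between observed
    state increments and the intervention-free model's prediction."""
    return (zs[1:] - zs[:-1]) / dt - native_dynamics(zs[:-1])


zs, us = simulate(z0=0.0, u0=1.0)
u_hat = infer_intervention(zs)
```

In this noiseless toy the residual recovers the intervention exactly; the paper's contribution is doing the analogous inference with learned neural ODEs from noisy, remote observations.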
Related papers
- Non-Invasive Reconstruction of Cardiac Activation Dynamics Using Physics-Informed Neural Networks [0.0]
We present a physics-informed neural network framework for recovering cardiac activation patterns. Our approach integrates nonlinear anisotropic modeling, heterogeneous fiber orientation, weak formulations of the governing mechanics, and finite-element loss functions to embed physical constraints directly into training.
arXiv Detail & Related papers (2026-03-04T08:33:16Z) - Recovering Whole-Brain Causal Connectivity under Indirect Observation with Applications to Human EEG and fMRI [18.05064735256717]
INCAMA (INdirect CAusal MAmba) is a latent-space causal discovery framework that accounts for measurement physics to separate neural dynamics from indirect observations. We validate INCAMA on large-scale biophysical simulations across EEG and fMRI, where it significantly outperforms standard pipelines.
arXiv Detail & Related papers (2026-01-30T01:26:28Z) - Unleashing Temporal Capacity of Spiking Neural Networks through Spatiotemporal Separation [67.69345363409835]
Spiking Neural Networks (SNNs) are considered naturally suited for temporal processing, with membrane potential propagation widely regarded as the core temporal modeling mechanism. We design Non-Stateful (NS) models that progressively remove membrane propagation to probe its stage-wise role. Experiments reveal a counterintuitive phenomenon: moderate removal in shallow layers improves performance, while excessive removal causes collapse.
arXiv Detail & Related papers (2025-12-05T07:05:53Z) - Generalised fractional Rabi problem [35.18016233072556]
Fractional quantum dynamics provides a natural framework to capture nonlocal temporal behavior and memory effects in quantum systems. In this work, we analyze the physical consequences of fractional-order quantum evolution using a Green's function formulation based on the Caputo fractional derivative. We find that even in the absence of external driving, the static Hamiltonian term induces non-trivial spin dynamics with damping features directly linked to the fractional temporal nonlocality.
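The summary above turns on the Caputo fractional derivative. The paper itself uses a Green's function formulation; the sketch below only illustrates the standard L1 discretization of the Caputo derivative of order 0 < α < 1 on a uniform grid (the grid, order, and test function are illustrative choices, not taken from the paper). For a linear function f(t) = t, the L1 scheme reproduces the known closed form t^(1-α)/Γ(2-α) exactly, which makes it easy to sanity-check.

```python
import math

import numpy as np


def caputo_l1(f_vals, dt, alpha):
    """L1 discretization of the Caputo fractional derivative of order
    0 < alpha < 1 on a uniform grid; returns D^alpha f at each grid point."""
    n = len(f_vals)
    coef = dt ** (-alpha) / math.gamma(2 - alpha)
    df = np.diff(f_vals)          # increments f[j+1] - f[j]
    out = np.zeros(n)
    for i in range(1, n):
        k = np.arange(i)          # k = 0 .. i-1, k = 0 is the newest increment
        b = (k + 1) ** (1 - alpha) - k ** (1 - alpha)
        out[i] = coef * np.dot(b, df[i - 1 - k])
    return out


# Sanity check against the closed form: D^0.5 of f(t) = t is sqrt(t)/Gamma(1.5).
t = np.linspace(0.0, 1.0, 101)
d_half = caputo_l1(t, t[1] - t[0], 0.5)
```

Because the L1 scheme approximates f' as piecewise constant, it is exact for linear f; for general f it is first-order accurate in the step size.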
arXiv Detail & Related papers (2025-10-09T12:51:57Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors, such as inertia, damping, a learned potential function, and forces, to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
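The underdamped Langevin equation named in the summary can be simulated with a few lines of Euler-Maruyama integration. In LangevinFlow the potential is a learned network; here a fixed quadratic potential U(x) = x²/2 stands in for it, and the friction, temperature, and step size are illustrative assumptions, not the paper's settings.

```python
import numpy as np


def langevin_step(x, v, grad_U, rng, gamma=1.0, temp=0.1, dt=0.01):
    """One Euler-Maruyama step of the underdamped Langevin equation:
        dx = v dt
        dv = (-grad_U(x) - gamma * v) dt + sqrt(2 * gamma * temp) dW
    inertia (v), damping (gamma), and a potential force enter explicitly."""
    noise = np.sqrt(2.0 * gamma * temp * dt) * rng.standard_normal()
    v_new = v + dt * (-grad_U(x) - gamma * v) + noise
    x_new = x + dt * v_new
    return x_new, v_new


# Relax from x = 2 toward the stationary distribution of U(x) = x^2 / 2.
rng = np.random.default_rng(0)
x, v = 2.0, 0.0
for _ in range(5000):
    x, v = langevin_step(x, v, grad_U=lambda q: q, rng=rng)
```

After many steps the state hovers near the potential minimum with fluctuations set by the temperature, which is the physical prior the paper exploits for latent dynamics.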
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Enhancing Revivals Via Projective Measurements in a Quantum Scarred System [51.3422222472898]
Quantum many-body scarred systems exhibit atypical dynamical behavior, evading thermalization and featuring periodic state revivals.
We investigate the impact of projective measurements on the dynamics in the scar subspace for the paradigmatic PXP model.
We identify a measurement-induced phase resynchronization, countering the natural dephasing of quantum scars, as the key mechanism underlying this phenomenon.
arXiv Detail & Related papers (2025-03-28T17:03:14Z) - Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations.
In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
arXiv Detail & Related papers (2025-03-20T12:28:08Z) - Generative Intervention Models for Causal Perturbation Modeling [80.72074987374141]
In many applications, it is a priori unknown which mechanisms of a system are modified by an external perturbation.
We propose a generative intervention model (GIM) that learns to map these perturbation features to distributions over atomic interventions.
arXiv Detail & Related papers (2024-11-21T10:37:57Z) - Joint trajectory and network inference via reference fitting [0.0]
We propose an approach that leverages both dynamical and perturbational single-cell data to jointly learn cellular trajectories and drive network inference.
Our approach is motivated by min-entropy estimation for dynamics and can infer directed and signed networks from time-stamped single-cell snapshots.
arXiv Detail & Related papers (2024-09-10T21:49:57Z) - Interpretable Spatio-Temporal Embedding for Brain Structural-Effective Network with Ordinary Differential Equation [56.34634121544929]
In this study, we first construct the brain-effective network via the dynamic causal model.
We then introduce an interpretable graph learning framework termed Spatio-Temporal Embedding ODE (STE-ODE).
This framework incorporates specifically designed directed node embedding layers, aiming at capturing the dynamic interplay between structural and effective networks.
arXiv Detail & Related papers (2024-05-21T20:37:07Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture compatible and scalable with deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - Inferring Relational Potentials in Interacting Systems [56.498417950856904]
We propose Neural Interaction Inference with Potentials (NIIP) as an alternative approach to discover such interactions.
NIIP assigns low energy to the subset of trajectories which respect the relational constraints observed.
It allows trajectory manipulation, such as interchanging interaction types across separately trained models, as well as trajectory forecasting.
arXiv Detail & Related papers (2023-10-23T00:44:17Z) - Extraction and Recovery of Spatio-Temporal Structure in Latent Dynamics
Alignment with Diffusion Models [1.4756031289693907]
In behavior-related brain computation, it is necessary to align neural signals across drastically different domains.
We propose an alignment method, ERDiff, which leverages the expressivity of the diffusion model to preserve the intrinsic spatio-temporal structure of latent dynamics.
Our method consistently preserves the spatio-temporal structure of latent dynamics and outperforms existing approaches in alignment goodness-of-fit and neural decoding performance.
arXiv Detail & Related papers (2023-06-09T05:53:11Z) - Information Theoretic Measures of Causal Influences during Transient
Neural Events [2.9327503320877457]
Transient phenomena play a key role in coordinating brain activity at multiple scales.
A key challenge for neural data science is to characterize the network interactions at play during these events.
arXiv Detail & Related papers (2022-09-15T17:51:46Z) - Inference of Affordances and Active Motor Control in Simulated Agents [0.5161531917413706]
We introduce an output-probabilistic, temporally predictive, modular artificial neural network architecture.
We show that our architecture develops latent states that can be interpreted as affordance maps.
In combination with active inference, we show that flexible, goal-directed behavior can be invoked.
arXiv Detail & Related papers (2022-02-23T14:13:04Z) - Continuous Learning and Adaptation with Membrane Potential and
Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z) - ACRE: Abstract Causal REasoning Beyond Covariation [90.99059920286484]
We introduce the Abstract Causal REasoning dataset for systematic evaluation of current vision systems in causal induction.
Motivated by the stream of research on causal discovery in Blicket experiments, we query a visual reasoning system with the following four types of questions in either an independent scenario or an interventional scenario.
We notice that pure neural models tend towards an associative strategy under their chance-level performance, whereas neuro-symbolic combinations struggle in backward-blocking reasoning.
arXiv Detail & Related papers (2021-03-26T02:42:38Z) - Neural Ordinary Differential Equations for Intervention Modeling [30.127870899307254]
Real-world systems often involve external interventions that cause changes in the system dynamics.
Neural ODE and a number of its recent variants are not suitable for modeling such interventions as they do not properly model the observations and the interventions separately.
We propose a novel neural ODE-based approach (IMODE) that properly models the effect of external interventions by employing two ODE functions to separately handle the observations and the interventions.
arXiv Detail & Related papers (2020-10-16T10:55:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.