Neural Superstatistics for Bayesian Estimation of Dynamic Cognitive Models
- URL: http://arxiv.org/abs/2211.13165v4
- Date: Wed, 20 Sep 2023 13:26:03 GMT
- Title: Neural Superstatistics for Bayesian Estimation of Dynamic Cognitive Models
- Authors: Lukas Schumacher, Paul-Christian Bürkner, Andreas Voss, Ullrich Köthe, Stefan T. Radev
- Abstract summary: We develop a simulation-based deep learning method for Bayesian inference, which can recover both time-varying and time-invariant parameters.
Our results show that the deep learning approach is very efficient in capturing the temporal dynamics of the model.
- Score: 2.7391842773173334
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Mathematical models of cognition are often memoryless and ignore potential
fluctuations of their parameters. However, human cognition is inherently
dynamic. Thus, we propose to augment mechanistic cognitive models with a
temporal dimension and estimate the resulting dynamics from a superstatistics
perspective. Such a model entails a hierarchy between a low-level observation
model and a high-level transition model. The observation model describes the
local behavior of a system, and the transition model specifies how the
parameters of the observation model evolve over time. To overcome the
estimation challenges resulting from the complexity of superstatistical models,
we develop and validate a simulation-based deep learning method for Bayesian
inference, which can recover both time-varying and time-invariant parameters.
We first benchmark our method against two existing frameworks capable of
estimating time-varying parameters. We then apply our method to fit a dynamic
version of the diffusion decision model to long time series of human response
time data. Our results show that the deep learning approach is very efficient
in capturing the temporal dynamics of the model. Furthermore, we show that the
erroneous assumption of static or homogeneous parameters will hide important
temporal information.
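To make the hierarchy concrete, below is a minimal simulation sketch, assuming a Gaussian random walk as the high-level transition model over the drift rate of a simple Wiener diffusion observation model; the function names, step sizes, and trial counts are illustrative choices, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_rt(drift, boundary=2.0, dt=1e-3, noise=1.0, max_steps=10_000):
    """Low-level observation model: first-passage time of a noisy
    accumulator starting midway between 0 and `boundary`."""
    x, t = boundary / 2.0, 0.0
    for _ in range(max_steps):
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x <= 0.0 or x >= boundary:
            break
    return t, int(x >= boundary)  # (response time, choice)

# High-level transition model: the drift rate follows a Gaussian
# random walk across trials (an illustrative transition model).
n_trials = 500
sigma_v = 0.05  # random-walk scale
drift = np.empty(n_trials)
drift[0] = 1.0
for i in range(1, n_trials):
    drift[i] = drift[i - 1] + sigma_v * rng.standard_normal()

# One response time per trial: a long time series whose local behavior
# is governed by the slowly drifting parameter above.
rts = np.array([simulate_rt(v)[0] for v in drift])
```

In the simulation-based approach described above, many such simulated series (with transition and observation parameters drawn from priors) would serve as training data for a neural network that amortizes Bayesian inference over both the time-varying and the time-invariant parameters.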
Related papers
- Supervised Score-Based Modeling by Gradient Boosting [49.556736252628745]
We propose a Supervised Score-based Model (SSM), which can be viewed as a gradient boosting algorithm that incorporates score matching.
We provide a theoretical analysis of learning and sampling for SSM to balance inference time and prediction accuracy.
Our model outperforms existing models in both accuracy and inference time.
arXiv Detail & Related papers (2024-11-02T07:06:53Z)
- Modeling Randomly Observed Spatiotemporal Dynamical Systems [7.381752536547389]
Currently available neural network-based modeling approaches fall short when faced with data collected randomly over time and space.
In response, we developed a new method that effectively handles such randomly sampled data.
Our model integrates techniques from amortized variational inference, neural differential equations, neural point processes, and implicit neural representations to predict both the dynamics of the system and the timings and locations of future observations.
arXiv Detail & Related papers (2024-06-01T09:03:32Z)
- Towards Learning Stochastic Population Models by Gradient Descent [0.0]
We show that simultaneous estimation of parameters and structure poses major challenges for optimization procedures.
We demonstrate accurate estimation of models but find that enforcing the inference of parsimonious, interpretable models drastically increases the difficulty.
arXiv Detail & Related papers (2024-04-10T14:38:58Z)
- Discovering Dynamic Patterns from Spatiotemporal Data with Time-Varying Low-Rank Autoregression [12.923271427789267]
We develop a time-reduced-rank vector autoregression model whose coefficients are parameterized by low-rank tensor factorization.
In the temporal context, the complex time-varying system behaviors can be revealed by the temporal modes in the proposed model.
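As a rough illustration of that parameterization, the sketch below builds time-varying VAR coefficient matrices from shared rank-1 spatial factors weighted by smooth temporal modes; this CP-style decomposition and all dimensions are assumptions for illustration, not necessarily the paper's exact factorization.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, R = 5, 200, 2                    # series, time steps, rank

U = 0.3 * rng.standard_normal((N, R))  # left spatial factors
V = 0.3 * rng.standard_normal((N, R))  # right spatial factors
# Temporal modes: slowly varying weights, one per rank component.
G = np.stack([0.5 + 0.4 * np.sin(np.linspace(0, 6, T) + r)
              for r in range(R)], axis=1)

def coeff(t):
    # Coefficient matrix at time t: sum_r G[t, r] * u_r v_r^T
    return sum(G[t, r] * np.outer(U[:, r], V[:, r]) for r in range(R))

# Simulate the time-varying VAR(1): y_t = A_t y_{t-1} + noise.
y = np.zeros((T, N))
y[0] = rng.standard_normal(N)
for t in range(1, T):
    y[t] = coeff(t) @ y[t - 1] + 0.1 * rng.standard_normal(N)
```

Here the columns of G play the role of the temporal modes mentioned in the summary: plotting them shows how strongly each spatial pattern is active at each time step.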
arXiv Detail & Related papers (2022-11-28T15:59:52Z)
- On the Influence of Enforcing Model Identifiability on Learning dynamics of Gaussian Mixture Models [14.759688428864159]
We propose a technique for extracting submodels from singular models.
Our method enforces model identifiability during training.
We show how the method can be applied to more complex models like deep neural networks.
arXiv Detail & Related papers (2022-06-17T07:50:22Z)
- Time varying regression with hidden linear dynamics [74.9914602730208]
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system.
Counterintuitively, we show that when the underlying dynamics are stable, the parameters of this model can be estimated from data by combining just two ordinary least squares estimates.
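A hedged sketch of that setup: regression weights evolve under an assumed stable linear system, responses are noisy projections of the current weights, and a crude stand-in for the two-least-squares idea fits the weights within short windows and then regresses successive window estimates on one another (recovering roughly the dynamics composed over a window, A^W). The exact estimator in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, W = 3, 2000, 40                # dimension, horizon, window length
A = 0.999 * np.eye(d)                # stable hidden linear dynamics (assumed)
theta = rng.standard_normal(d)

X = rng.standard_normal((T, d))
Y = np.empty(T)
for t in range(T):
    Y[t] = X[t] @ theta + 0.1 * rng.standard_normal()
    theta = A @ theta                # weights drift between observations

# First least squares: estimate the slowly moving weights per window.
est = np.array([np.linalg.lstsq(X[i:i + W], Y[i:i + W], rcond=None)[0]
                for i in range(0, T - W + 1, W)])

# Second least squares: regress successive window estimates on each
# other; for this setup the solution approximates A**W, not A itself.
A_W_hat = np.linalg.lstsq(est[:-1], est[1:], rcond=None)[0].T
```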
arXiv Detail & Related papers (2021-12-29T23:37:06Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster than their ODE-solver-based counterparts.
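A hedged sketch of what such a closed-form cell can look like: the hidden state is an explicit function of the elapsed time, gated between two network heads by a learned, input-dependent time constant, so no numerical ODE solver is invoked. The gating form loosely follows the published CfC equation; layer shapes and nonlinearities here are illustrative.

```python
import torch
import torch.nn as nn

class CfCCell(nn.Module):
    """Illustrative closed-form continuous-depth cell."""

    def __init__(self, in_dim: int, hidden: int):
        super().__init__()
        self.f = nn.Linear(in_dim + hidden, hidden)  # time-constant head
        self.g = nn.Linear(in_dim + hidden, hidden)  # short-horizon head
        self.h = nn.Linear(in_dim + hidden, hidden)  # long-horizon head

    def forward(self, x, inp, t):
        # x: hidden state, inp: input, t: elapsed time since last input
        z = torch.cat([x, inp], dim=-1)
        gate = torch.sigmoid(-self.f(z) * t)  # explicit in t: no solver
        return gate * torch.tanh(self.g(z)) + (1.0 - gate) * torch.tanh(self.h(z))
```

Because the state at any time t is computed in one shot rather than by stepping a solver, a forward pass costs a fixed number of matrix multiplies, which is where the claimed speedup over solver-based continuous-depth models comes from.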
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for resolving such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
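As a sketch of the smoothness-inducing idea, one option is to penalize divergence between the emission distributions of adjacent time stamps; the Gaussian KL form below is an illustrative choice, not necessarily the paper's exact regularizer.

```python
import torch

def gaussian_kl(mu1, logvar1, mu2, logvar2):
    # Elementwise KL( N(mu1, var1) || N(mu2, var2) )
    return 0.5 * (logvar2 - logvar1
                  + (logvar1.exp() + (mu1 - mu2) ** 2) / logvar2.exp()
                  - 1.0)

def smoothness_penalty(mu, logvar):
    # mu, logvar: (T, D) per-time-stamp means and log-variances produced
    # by the flexible neural networks mentioned above; pull each pair of
    # neighboring emission distributions toward each other.
    return gaussian_kl(mu[:-1], logvar[:-1], mu[1:], logvar[1:]).sum()
```

Adding such a term to the variational objective biases reconstructions toward smooth trajectories, so abrupt deviations stand out as anomalies.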
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
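The infinite probabilistic horizon admits a simple sampling view: keep stepping the one-step dynamics with probability $\gamma$ and stop otherwise, which yields a geometrically discounted state occupancy. The sketch below illustrates only this sampling idea with a hypothetical one-step simulator `env_step`; it is not the paper's generative temporal-difference training procedure.

```python
import random

def sample_discounted_state(state, env_step, gamma=0.99):
    """Sample from the gamma-discounted occupancy starting at `state`.

    With probability gamma we take one more step of the dynamics; with
    probability (1 - gamma) we stop, so a state reached after k steps is
    returned with probability (1 - gamma) * gamma**k.
    """
    while random.random() < gamma:
        state = env_step(state)
    return state
```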
arXiv Detail & Related papers (2020-10-27T17:54:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.