Deep learning and differential equations for modeling changes in
individual-level latent dynamics between observation periods
- URL: http://arxiv.org/abs/2202.07403v1
- Date: Tue, 15 Feb 2022 13:53:42 GMT
- Title: Deep learning and differential equations for modeling changes in
individual-level latent dynamics between observation periods
- Authors: Göran Köber, Raffael Kalisch, Lara Puhlmann, Andrea Chmitorz,
Anita Schick, and Harald Binder
- Abstract summary: We propose an extension where different sets of differential equation parameters are allowed for observation sub-periods.
We derive prediction targets from individual dynamic models of resilience in the application.
Our approach is seen to successfully identify individual-level parameters of dynamic models that allow us to stably select predictors.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: When modeling longitudinal biomedical data, both dimensionality
reduction and dynamic modeling in the resulting latent representation are often needed. This
can be achieved by artificial neural networks for dimension reduction, and
differential equations for dynamic modeling of individual-level trajectories.
However, such approaches so far assume that parameters of individual-level
dynamics are constant throughout the observation period. Motivated by an
application from psychological resilience research, we propose an extension
where different sets of differential equation parameters are allowed for
observation sub-periods. Still, estimation across intra-individual sub-periods is
coupled so that the model can be fitted even with a relatively small dataset.
We subsequently derive prediction targets from individual dynamic models of
resilience in the application. These serve as interpretable resilience-related
outcomes, to be predicted from characteristics of individuals, measured at
baseline and a follow-up time point, and selecting a small set of important
predictors. Our approach is seen to successfully identify individual-level
parameters of dynamic models that allow us to stably select predictors, i.e.,
resilience factors. Furthermore, we can identify those characteristics of
individuals that are the most promising for updates at follow-up, which might
inform future study design. This underlines the usefulness of our proposed deep
dynamic modeling approach with changes in parameters between observation
sub-periods.
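The core idea, latent dynamics governed by differential equations whose parameters are allowed to differ between observation sub-periods, can be illustrated with a minimal sketch. This is not the authors' implementation: the two-dimensional linear (damped-oscillator) dynamics, the Euler integrator, and all names are assumptions made purely for illustration.

```python
import numpy as np

def latent_trajectory(z0, params_per_period, boundaries, dt=0.01):
    """Integrate simple linear latent dynamics dz/dt = A @ z, using a
    separate matrix A for each observation sub-period (Euler scheme).
    `boundaries` gives the end time of each sub-period."""
    z = np.asarray(z0, dtype=float)
    traj, t = [z.copy()], 0.0
    for A, t_end in zip(params_per_period, boundaries):
        while t < t_end:
            z = z + dt * (A @ z)  # Euler step with this period's parameters
            t += dt
            traj.append(z.copy())
    return np.array(traj)

# Two sub-periods with different parameter sets: the latent state is a
# damped oscillation, with stronger damping in the second sub-period
A1 = np.array([[0.0, 1.0], [-1.0, -0.1]])
A2 = np.array([[0.0, 1.0], [-1.0, -0.5]])
traj = latent_trajectory([1.0, 0.0], [A1, A2], boundaries=[5.0, 10.0])
```

In the paper's setting, the per-sub-period parameters would not be fixed by hand but estimated jointly (coupled across sub-periods) from data, with the latent states themselves produced by a neural-network dimension reduction.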
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Neural Persistence Dynamics [8.197801260302642]
We consider the problem of learning the dynamics in the topology of time-evolving point clouds.
Our proposed model -- neural persistence dynamics -- substantially outperforms the state-of-the-art across a diverse set of parameter regression tasks.
arXiv Detail & Related papers (2024-05-24T17:20:18Z) - Synthetic location trajectory generation using categorical diffusion
models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved to be one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z) - Anamnesic Neural Differential Equations with Orthogonal Polynomial
Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
arXiv Detail & Related papers (2023-03-03T10:49:09Z) - Neural Superstatistics for Bayesian Estimation of Dynamic Cognitive
Models [2.7391842773173334]
We develop a simulation-based deep learning method for Bayesian inference, which can recover both time-varying and time-invariant parameters.
Our results show that the deep learning approach is very efficient in capturing the temporal dynamics of the model.
arXiv Detail & Related papers (2022-11-23T17:42:53Z) - Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive evidence lower bounds (ELBOs) for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Time varying regression with hidden linear dynamics [74.9914602730208]
We revisit a model for time-varying linear regression that assumes the unknown parameters evolve according to a linear dynamical system.
Counterintuitively, we show that when the underlying dynamics are stable the parameters of this model can be estimated from data by combining just two ordinary least squares estimates.
arXiv Detail & Related papers (2021-12-29T23:37:06Z) - SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred
from Vision [73.26414295633846]
A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations.
Existing methods rely on image reconstruction quality, which does not always reflect the quality of the learnt latent dynamics.
We develop a set of new measures, including a binary indicator of whether the underlying Hamiltonian dynamics have been faithfully captured.
arXiv Detail & Related papers (2021-11-10T23:26:58Z) - Disentangled Generative Models for Robust Prediction of System Dynamics [2.6424064030995957]
In this work, we treat the domain parameters of dynamical systems as factors of variation of the data generating process.
By leveraging ideas from supervised disentanglement and causal factorization, we aim to separate the domain parameters from the dynamics in the latent space of generative models.
Results indicate that disentangled VAEs adapt better to domain parameters spaces that were not present in the training data.
arXiv Detail & Related papers (2021-08-26T09:58:06Z) - Deep dynamic modeling with just two time points: Can we still allow for
individual trajectories? [0.0]
In epidemiological cohort studies and clinical registries, longitudinal biomedical data are often characterized by a sparse time grid.
Inspired by recent advances that allow to combine deep learning with dynamic modeling, we investigate whether such approaches can be useful for uncovering complex structure.
We show that such dynamic deep learning approaches can be useful even in extreme small data settings, but need to be carefully adapted.
arXiv Detail & Related papers (2020-12-01T16:58:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.