Deep Dynamic Probabilistic Canonical Correlation Analysis
- URL: http://arxiv.org/abs/2502.05155v1
- Date: Fri, 07 Feb 2025 18:37:57 GMT
- Title: Deep Dynamic Probabilistic Canonical Correlation Analysis
- Authors: Shiqin Tang, Shujian Yu, Yining Dong, S. Joe Qin
- Abstract summary: Deep Dynamic Probabilistic Canonical Correlation Analysis (D2PCCA) is a model that integrates deep learning with probabilistic modeling to analyze nonlinear dynamical systems. Building on the probabilistic extensions of Canonical Correlation Analysis (CCA), D2PCCA captures nonlinear latent dynamics. D2PCCA naturally extends to multiple observed variables, making it a versatile tool for encoding prior knowledge about sequential datasets.
- Score: 16.82419839795058
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents Deep Dynamic Probabilistic Canonical Correlation Analysis (D2PCCA), a model that integrates deep learning with probabilistic modeling to analyze nonlinear dynamical systems. Building on the probabilistic extensions of Canonical Correlation Analysis (CCA), D2PCCA captures nonlinear latent dynamics and supports enhancements such as KL annealing for improved convergence and normalizing flows for a more flexible posterior approximation. D2PCCA naturally extends to multiple observed variables, making it a versatile tool for encoding prior knowledge about sequential datasets and providing a probabilistic understanding of the system's dynamics. Experimental validation on real financial datasets demonstrates the effectiveness of D2PCCA and its extensions in capturing latent dynamics.
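As a concrete illustration of the ingredients the abstract lists, below is a minimal PyTorch-style sketch of a dynamic probabilistic CCA model: a shared nonlinear latent chain, one emission network per observed view, and a sequential ELBO with a KL-annealing weight. This is a hedged sketch of the general model class, not the authors' implementation; every module, name, and dimension is an illustrative assumption, and a diagonal Gaussian posterior stands in where the paper would allow a normalizing-flow refinement.

```python
import torch
import torch.nn as nn

def mlp(din, dout, hidden=64):
    return nn.Sequential(nn.Linear(din, hidden), nn.Tanh(), nn.Linear(hidden, dout))

class D2PCCASketch(nn.Module):
    """Illustrative dynamic probabilistic CCA: one shared latent chain, two emission heads."""
    def __init__(self, dx, dy, dz):
        super().__init__()
        self.prior = mlp(dz, 2 * dz)           # p(z_t | z_{t-1}): mean and log-variance
        self.post = mlp(dz + dx + dy, 2 * dz)  # q(z_t | z_{t-1}, x_t, y_t)
        self.dec_x = mlp(dz, dx)               # emission network for view x
        self.dec_y = mlp(dz, dy)               # emission network for view y
        self.dz = dz

    def elbo(self, x, y, beta=1.0):
        """x: (T, dx), y: (T, dy); beta is the KL-annealing weight."""
        z_prev, elbo = torch.zeros(self.dz), 0.0
        for t in range(x.shape[0]):
            mu_p, logv_p = self.prior(z_prev).chunk(2)
            mu_q, logv_q = self.post(torch.cat([z_prev, x[t], y[t]])).chunk(2)
            z = mu_q + torch.randn_like(mu_q) * (0.5 * logv_q).exp()  # reparameterized sample
            # Gaussian log-likelihood terms (unit observation noise for brevity)
            elbo -= 0.5 * ((x[t] - self.dec_x(z)) ** 2).sum()
            elbo -= 0.5 * ((y[t] - self.dec_y(z)) ** 2).sum()
            # Analytic KL between the two diagonal Gaussians
            kl = 0.5 * (logv_p - logv_q
                        + (logv_q.exp() + (mu_q - mu_p) ** 2) / logv_p.exp() - 1).sum()
            elbo -= beta * kl
            z_prev = z
        return elbo
```

KL annealing here amounts to ramping `beta` from 0 toward 1 over the first training epochs, so the approximate posterior is not collapsed onto the prior before the decoders have learned anything useful.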
Related papers
- InfoDPCCA: Information-Theoretic Dynamic Probabilistic Canonical Correlation Analysis [20.656410520966986]
InfoDPCCA is a framework designed to model two interdependent sequences of observations. We introduce a two-step training scheme to bridge the gap between information-theoretic representation learning and generative modeling. We demonstrate that InfoDPCCA excels as a tool for representation learning.
arXiv Detail & Related papers (2025-06-10T15:13:48Z)
- Probabilistic Decomposed Linear Dynamical Systems for Robust Discovery of Latent Neural Dynamics [5.841659874892801]
Time-varying linear state-space models are powerful tools for obtaining mathematically interpretable representations of neural signals.
Existing methods for latent variable estimation are not robust to dynamical noise and system nonlinearity.
We propose a probabilistic approach to latent variable estimation in decomposed models that improves robustness against dynamical noise.
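For reference, the classical baseline for latent-state estimation in a linear-Gaussian state-space model is the Kalman filter; the NumPy sketch below is that standard recursion (with optional time-varying dynamics), not the paper's probabilistic decomposed method.

```python
import numpy as np

def kalman_filter(ys, A, C, Q, R, mu0, P0):
    """Standard Kalman filter for z_t = A_t z_{t-1} + w_t, y_t = C z_t + v_t.

    Pass A as a list of matrices for time-varying dynamics, or a single matrix.
    """
    mu, P = mu0, P0
    means = []
    for t, y in enumerate(ys):
        At = A[t] if isinstance(A, list) else A
        # Predict step: propagate mean and covariance through the dynamics
        mu, P = At @ mu, At @ P @ At.T + Q
        # Update step: correct with the new observation
        S = C @ P @ C.T + R                 # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)      # Kalman gain
        mu = mu + K @ (y - C @ mu)
        P = (np.eye(len(mu)) - K @ C) @ P
        means.append(mu)
    return np.array(means)
```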
arXiv Detail & Related papers (2024-08-29T18:58:39Z)
- Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances these two objectives: the Gaussian Process Switching Linear Dynamical System (gpSLDS). Our method builds on previous work modeling the latent state evolution via a differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs). Our approach resolves key limitations of the rSLDS, such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z)
- Physically Analyzable AI-Based Nonlinear Platoon Dynamics Modeling During Traffic Oscillation: A Koopman Approach [4.379212829795889]
There is a critical need for a modeling methodology that achieves high accuracy while remaining physically analyzable.
This paper proposes an AI-based Koopman approach to model unknown nonlinear platoon dynamics.
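The generic data-driven route to an approximate Koopman operator is extended dynamic mode decomposition (EDMD): lift the state through a dictionary of observables, then fit a linear map by least squares. The sketch below shows that generic recipe, not the paper's AI-based variant; the polynomial dictionary and the toy data are illustrative assumptions.

```python
import numpy as np

def lift(x):
    """Polynomial dictionary of observables (an illustrative choice)."""
    return np.concatenate([x, x**2, [1.0]])

def fit_koopman(X, Xnext):
    """Least-squares Koopman approximation K with lift(x_next) ~ K @ lift(x)."""
    Psi = np.stack([lift(x) for x in X])
    PsiNext = np.stack([lift(x) for x in Xnext])
    W, *_ = np.linalg.lstsq(Psi, PsiNext, rcond=None)
    return W.T  # linear one-step predictor in the lifted space

# Toy data from a weakly nonlinear scalar map
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
Xnext = 0.9 * X - 0.1 * X**3
K = fit_koopman(X, Xnext)
```

In an AI-based variant the fixed dictionary is replaced by a learned encoder while the lifted-space predictions stay linear, which is what preserves physical analyzability.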
arXiv Detail & Related papers (2024-06-20T19:35:21Z)
- eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently learns a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z)
- Data-Driven Characterization of Latent Dynamics on Quantum Testbeds [0.23408308015481663]
We augment the dynamical equation of quantum systems described by the Lindblad master equation with a parameterized source term.
We consider a structure-preserving augmentation that learns and distinguishes unitary from dissipative latent dynamics, parameterized by a basis of linear operators.
We demonstrate that our interpretable, structure-preserving, nonlinear models improve prediction accuracy relative to the Lindblad master equation alone.
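For context, this is the Lindblad master equation with a generic parameterized source term appended, in the spirit of the augmentation described above ($F_\theta$ is an illustrative placeholder; the paper's exact parameterization may differ):

$$
\frac{d\rho}{dt} = -\frac{i}{\hbar}[H,\rho] + \sum_k \left( L_k \rho L_k^\dagger - \tfrac{1}{2}\{L_k^\dagger L_k,\rho\} \right) + F_\theta(\rho),
$$

where $\rho$ is the density matrix, $H$ the Hamiltonian, and $L_k$ the jump operators modeling dissipation.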
arXiv Detail & Related papers (2024-01-18T09:28:44Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Latent Variable Representation for Reinforcement Learning [131.03944557979725]
It remains unclear theoretically and empirically how latent variable models may facilitate learning, planning, and exploration to improve the sample efficiency of model-based reinforcement learning.
We provide a representation view of the latent variable models for state-action value functions, which allows both tractable variational learning algorithm and effective implementation of the optimism/pessimism principle.
In particular, we propose a computationally efficient planning algorithm with UCB exploration by incorporating kernel embeddings of latent variable models.
arXiv Detail & Related papers (2022-12-17T00:26:31Z)
- Dynamically-Scaled Deep Canonical Correlation Analysis [77.34726150561087]
Canonical Correlation Analysis (CCA) extracts features from two views by finding maximally correlated linear projections of them.
We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model.
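For background, classical linear CCA has a closed-form solution: whiten each view, then take the SVD of the whitened cross-covariance. The NumPy sketch below computes that baseline; the paper's contribution is to make the projections input-dependent rather than fixed.

```python
import numpy as np

def linear_cca(X, Y, k=2, eps=1e-6):
    """Top-k canonical projections for views X (n, dx) and Y (n, dy)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = X.shape[0]
    Cxx = Xc.T @ Xc / n + eps * np.eye(X.shape[1])  # regularized auto-covariance
    Cyy = Yc.T @ Yc / n + eps * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n                             # cross-covariance
    # Whiten each view, then SVD the whitened cross-covariance
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy.T)
    A = Wx.T @ U[:, :k]   # projection for X
    B = Wy.T @ Vt[:k].T   # projection for Y
    return A, B, s[:k]    # s[:k] are the canonical correlations
```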
arXiv Detail & Related papers (2022-03-23T12:52:49Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
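Here is a minimal latent-ODE-style sketch in PyTorch, with fixed-step Euler integration standing in for a proper ODE solver; the input-conditioned dynamics function loosely mirrors the idea of separating the effects of system inputs, and every name and dimension is an illustrative assumption, not the paper's architecture.

```python
import torch
import torch.nn as nn

class LatentODESketch(nn.Module):
    """Latent state z evolves as dz/dt = f(z, u); a decoder maps z to observations."""
    def __init__(self, dz, du, dx, hidden=64):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dz + du, hidden), nn.Tanh(), nn.Linear(hidden, dz))
        self.dec = nn.Linear(dz, dx)

    def forward(self, z0, u_seq, dt=0.1):
        """z0: (dz,) initial latent; u_seq: (T, du) system inputs; returns (T, dx)."""
        z, xs = z0, []
        for u in u_seq:
            z = z + dt * self.f(torch.cat([z, u]))  # explicit Euler step
            xs.append(self.dec(z))
        return torch.stack(xs)
```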
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Convex Analysis of the Mean Field Langevin Dynamics [49.66486092259375]
A convergence rate analysis of the mean field Langevin dynamics is presented.
The proximal Gibbs distribution $p_q$ associated with the dynamics allows us to develop a convergence theory parallel to classical results in convex optimization.
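In the standard formulation (writing $F$ for the objective over distributions and $\lambda$ for the entropy-regularization strength; the paper's notation may differ), the mean field Langevin dynamics and the proximal Gibbs distribution $p_q$ read

$$
dX_t = -\nabla \frac{\delta F}{\delta q}(q_t)(X_t)\,dt + \sqrt{2\lambda}\,dW_t,
\qquad
p_q(x) \propto \exp\!\left(-\frac{1}{\lambda}\,\frac{\delta F}{\delta q}(q)(x)\right),
$$

where $q_t$ is the law of $X_t$ and $\delta F/\delta q$ is the first variation of $F$.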
arXiv Detail & Related papers (2022-01-25T17:13:56Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of its stochasticity in that success is still unclear.
We show that heavy-tailed behavior commonly arises in model parameters due to multiplicative noise.
A detailed analysis is conducted of key factors, including step size and data, with similar results observed across state-of-the-art neural network models.
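The mechanism is easy to reproduce in a toy Kesten-type recursion, where multiplicative noise alone drives power-law tails; a hedged illustration of the general phenomenon, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 10_000, 2_000
x = np.ones(n)
for _ in range(T):
    a = rng.normal(1.0, 0.3, size=n)  # multiplicative noise, contracting on average
    b = rng.normal(0.0, 0.1, size=n)  # small additive noise
    x = a * x + b                     # Kesten recursion: heavy-tailed stationary law

# Heavy tails show up as excess kurtosis far above the Gaussian value of 0
kurt = ((x - x.mean()) ** 4).mean() / x.std() ** 4 - 3
print(f"excess kurtosis: {kurt:.1f}")
```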
arXiv Detail & Related papers (2020-06-11T09:58:01Z)