Nonlinear time-series embedding by monotone variational inequality
- URL: http://arxiv.org/abs/2406.06894v1
- Date: Tue, 11 Jun 2024 02:19:31 GMT
- Title: Nonlinear time-series embedding by monotone variational inequality
- Authors: Jonathan Y. Zhou, Yao Xie
- Abstract summary: We introduce a new method to learn low-dimensional representations of nonlinear time series without supervision.
The learned representation can be used for downstream machine-learning tasks such as clustering and classification.
- Score: 6.992239210938067
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the wild, we often encounter collections of sequential data such as electrocardiograms, motion capture, genomes, and natural language; the sequences may be multichannel or symbolic, with nonlinear dynamics. We introduce a new method to learn low-dimensional representations of nonlinear time series without supervision, with provable recovery guarantees. The learned representation can be used for downstream machine-learning tasks such as clustering and classification. The method is based on the assumption that the observed sequences arise from a common domain, but each sequence obeys its own autoregressive model, with the models related to each other through low-rank regularization. We cast the problem as a computationally efficient convex matrix parameter recovery problem using monotone variational inequality and encode the common-domain assumption via a low-rank constraint across the learned representations; this learns the geometry of the entire domain as well as faithful representations of each individual sequence's dynamics, using the domain information in totality. We show the competitive performance of our method against baselines on real-world time-series data and demonstrate its effectiveness for symbolic text modeling and RNA sequence clustering.
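The low-rank-across-sequences idea in the abstract can be illustrated with a deliberately simplified sketch: fit each sequence's autoregressive coefficients by least squares, stack them into a matrix, and take a truncated SVD so the shared structure yields a low-dimensional embedding per sequence. This is plain linear algebra, not the paper's monotone-VI convex program, and the function names are hypothetical.

```python
import numpy as np

def fit_ar_coeffs(x, p):
    """Least-squares fit of an order-p autoregressive model to one sequence."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def embed_sequences(seqs, p=3, rank=2):
    """Stack per-sequence AR coefficients and take a low-rank factorization.

    The rank-r left factors serve as low-dimensional embeddings, mimicking
    the low-rank-across-sequences assumption (here via a plain SVD rather
    than the monotone-VI matrix recovery of the paper)."""
    A = np.vstack([fit_ar_coeffs(np.asarray(s, float), p) for s in seqs])
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :rank] * S[:rank]  # one rank-dim embedding per sequence
```

The resulting embeddings can then be handed to any clustering or classification routine, matching the downstream uses the abstract mentions.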
Related papers
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Machine learning approach to detect dynamical states from recurrence measures [0.0]
We implement three machine learning algorithms for this study: Logistic Regression, Random Forest, and Support Vector Machine.
For training and testing we generate synthetic data from standard nonlinear dynamical systems.
We illustrate how the trained algorithms can successfully predict the dynamical states of two variable stars, SX Her and AC Her.
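A hedged sketch of that classification pipeline, using synthetic two-dimensional stand-ins for recurrence measures (not the paper's actual recurrence-quantification features or data):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical recurrence-based features (e.g. recurrence rate, determinism);
# the two dynamical states are drawn from well-separated Gaussians.
periodic = rng.normal(loc=[0.8, 0.9], scale=0.05, size=(200, 2))
chaotic = rng.normal(loc=[0.4, 0.5], scale=0.10, size=(200, 2))
X, y = np.vstack([periodic, chaotic]), np.repeat([0, 1], 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Train the three classifiers named in the summary and record test accuracy.
scores = {}
for clf in (LogisticRegression(), RandomForestClassifier(random_state=0), SVC()):
    scores[type(clf).__name__] = clf.fit(Xtr, ytr).score(Xte, yte)
```

With real data, the feature rows would be recurrence measures computed from observed light curves rather than Gaussian draws.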
arXiv Detail & Related papers (2024-01-18T05:02:36Z)
- Efficient Interpretable Nonlinear Modeling for Multiple Time Series [5.448070998907116]
This paper proposes an efficient nonlinear modeling approach for multiple time series.
It incorporates nonlinear interactions among different time-series variables.
Experimental results show that the proposed algorithm improves the identification of the support of the VAR coefficients in a parsimonious manner.
arXiv Detail & Related papers (2023-09-29T11:42:59Z)
- Learning Linear Causal Representations from Interventions under General Nonlinear Mixing [52.66151568785088]
We prove strong identifiability results given unknown single-node interventions without access to the intervention targets.
This is the first instance of causal identifiability from non-paired interventions for deep neural network embeddings.
arXiv Detail & Related papers (2023-06-04T02:32:12Z)
- Learning the Dynamics of Sparsely Observed Interacting Systems [0.6021787236982659]
We address the problem of learning the dynamics of an unknown non-parametric system linking a target and a feature time series.
By leveraging the rich theory of signatures, we are able to cast this non-linear problem as a high-dimensional linear regression.
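A minimal sketch of that idea: compute a level-2 truncated path signature by hand via discrete iterated sums, producing features that can then be fed to ordinary linear regression. This is a toy approximation, not a full signature library, and the helper name is made up.

```python
import numpy as np

def signature_level2(path):
    """Truncated (level-2) signature of a d-dimensional discrete path.

    Level 1: total increments (d terms); level 2: discrete iterated
    integrals (d*d terms). Returns a flat feature vector of length d + d*d."""
    inc = np.diff(path, axis=0)                      # (n-1, d) increments
    s1 = inc.sum(axis=0)                             # level-1 terms
    cum = np.cumsum(inc, axis=0)
    prev = np.vstack([np.zeros(path.shape[1]), cum[:-1]])
    s2 = prev.T @ inc                                # level-2 iterated sums
    return np.concatenate([s1, s2.ravel()])
```

Regressing a target series on such signature features of the input path is linear in the features, which is how the nonlinear problem becomes a high-dimensional linear regression.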
arXiv Detail & Related papers (2023-01-27T10:48:28Z)
- Chaos as an interpretable benchmark for forecasting and data-driven modelling [7.6146285961466]
Chaotic systems pose a unique challenge to modern statistical learning techniques.
We present a database currently comprising 131 known chaotic dynamical systems spanning fields such as astrophysics, climatology, and biochemistry.
arXiv Detail & Related papers (2021-10-11T13:39:41Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of invariance under coarse-graining.
This result explains why combining differencing schemes for derivatives reconstruction and local-in-time inference approaches does not work for time series analysis of second or higher order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)
- Graph Gamma Process Generalized Linear Dynamical Systems [60.467040479276704]
We introduce graph gamma process (GGP) linear dynamical systems to model real multivariate time series.
For temporal pattern discovery, the latent representation under the model is used to decompose the time series into a parsimonious set of multivariate sub-sequences.
We use the generated random graph, whose number of nonzero-degree nodes is finite, to define both the sparsity pattern and dimension of the latent state transition matrix.
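A hedged numpy sketch of the structural idea only, with a fixed Bernoulli random graph standing in for the graph gamma process (this is not the paper's inference procedure, and the dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 8  # illustrative latent dimension
# Stand-in for the inferred random graph: edges determine which
# transition-matrix entries are allowed to be nonzero.
adjacency = rng.random((k, k)) < 0.3
A = np.where(adjacency, rng.normal(scale=0.3, size=(k, k)), 0.0)

# The graph support defines the sparsity pattern of the transition matrix,
# so the latent state only propagates along graph edges.
z = rng.normal(size=k)
trajectory = np.empty((50, k))
for t in range(50):
    z = A @ z + 0.01 * rng.normal(size=k)
    trajectory[t] = z
```

In the actual model the graph is random with a finite number of nonzero-degree nodes, so the effective latent dimension is inferred rather than fixed in advance.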
arXiv Detail & Related papers (2020-07-25T04:16:34Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.