DYNOTEARS: Structure Learning from Time-Series Data
- URL: http://arxiv.org/abs/2002.00498v2
- Date: Mon, 27 Apr 2020 18:06:04 GMT
- Title: DYNOTEARS: Structure Learning from Time-Series Data
- Authors: Roxana Pamfil, Nisara Sriwattanaworachai, Shaan Desai, Philip
Pilgerstorfer, Paul Beaumont, Konstantinos Georgatzis, Bryon Aragam
- Abstract summary: We propose a method that simultaneously estimates contemporaneous (intra-slice) and time-lagged (inter-slice) relationships between variables in a time-series.
Compared to state-of-the-art methods for learning dynamic Bayesian networks, our method is both scalable and accurate on real data.
- Score: 6.7638850283606855
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We revisit the structure learning problem for dynamic Bayesian networks and
propose a method that simultaneously estimates contemporaneous (intra-slice)
and time-lagged (inter-slice) relationships between variables in a time-series.
Our approach is score-based, and revolves around minimizing a penalized loss
subject to an acyclicity constraint. To solve this problem, we leverage a
recent algebraic result characterizing the acyclicity constraint as a smooth
equality constraint. The resulting algorithm, which we call DYNOTEARS,
outperforms other methods on simulated data, especially in high-dimensions as
the number of variables increases. We also apply this algorithm on real
datasets from two different domains, finance and molecular biology, and analyze
the resulting output. Compared to state-of-the-art methods for learning dynamic
Bayesian networks, our method is both scalable and accurate on real data. The
simple formulation and competitive performance of our method make it suitable
for a variety of problems where one seeks to learn connections between
variables across time.
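The key algebraic ingredient the abstract mentions is the smooth characterization of acyclicity introduced by NOTEARS, which DYNOTEARS applies to the intra-slice weight matrix: h(W) = tr(e^{W∘W}) − d equals zero exactly when the weighted graph W is a DAG. A minimal sketch of that constraint function (names are illustrative, not the paper's code):

```python
import numpy as np
from scipy.linalg import expm

def acyclicity(W: np.ndarray) -> float:
    """Smooth acyclicity measure h(W) = tr(exp(W ∘ W)) - d.

    h(W) == 0 iff the weighted adjacency matrix W has no directed
    cycles; h(W) > 0 otherwise. The Hadamard square W ∘ W makes the
    measure differentiable and sign-independent of edge weights.
    """
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)

# Edge 0 -> 1 only: a DAG, so h is (numerically) zero.
dag = np.array([[0.0, 1.5],
                [0.0, 0.0]])
# Adding the reverse edge 1 -> 0 creates a 2-cycle, so h > 0.
cyclic = np.array([[0.0, 1.5],
                   [0.7, 0.0]])
print(acyclicity(dag))     # ~0.0
print(acyclicity(cyclic))  # > 0
```

In the score-based formulation, this h(W) = 0 equality constraint is enforced only on the contemporaneous (intra-slice) weights; time-lagged (inter-slice) edges cannot create cycles within a slice and need no such constraint.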
Related papers
- Switchable Decision: Dynamic Neural Generation Networks [98.61113699324429]

We propose a switchable decision to accelerate inference by dynamically assigning resources for each data instance.
Our method benefits from less cost during inference while keeping the same accuracy.
arXiv Detail & Related papers (2024-05-07T17:44:54Z) - TS-CausalNN: Learning Temporal Causal Relations from Non-linear Non-stationary Time Series Data [0.42156176975445486]
We propose a Time-Series Causal Neural Network (TS-CausalNN) to discover contemporaneous and lagged causal relations simultaneously.
In addition to the simple parallel design, an advantage of the proposed model is that it naturally handles the non-stationarity and non-linearity of the data.
arXiv Detail & Related papers (2024-04-01T20:33:29Z) - Diffeomorphic Transformations for Time Series Analysis: An Efficient
Approach to Nonlinear Warping [0.0]
The proliferation and ubiquity of temporal data across many disciplines has sparked interest in similarity, classification, and clustering methods.
Traditional distance measures such as the Euclidean are not well-suited due to the time-dependent nature of the data.
This thesis proposes novel elastic alignment methods that use parametric & diffeomorphic warping transformations.
arXiv Detail & Related papers (2023-09-25T10:51:47Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time point and can interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Deep Efficient Continuous Manifold Learning for Time Series Modeling [11.876985348588477]
Symmetric positive definite matrices are widely studied in computer vision, signal processing, and medical image analysis.
In this paper, we propose a framework that exploits a diffeomorphic mapping between a Riemannian manifold and a Cholesky space.
For dynamic modeling of time-series data, we devise a continuous manifold learning method by systematically integrating a manifold ordinary differential equation and a gated recurrent neural network.
arXiv Detail & Related papers (2021-12-03T01:38:38Z) - Simple Stochastic and Online Gradient Descent Algorithms for Pairwise
Learning [65.54757265434465]
Pairwise learning refers to learning tasks where the loss function depends on a pair of instances.
Online gradient descent (OGD) is a popular approach to handle streaming data in pairwise learning.
In this paper, we propose simple stochastic and online gradient descent methods for pairwise learning.
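To make the pairwise setting concrete: unlike pointwise losses, each loss term couples two examples (as in bipartite ranking or AUC maximization). A minimal OGD sketch, assuming a hinge ranking loss and the common one-pass simplification of pairing each new example with the previous one (this illustrates the setting, not the paper's exact algorithm):

```python
import numpy as np

def ogd_pairwise(stream, lr=0.1):
    """Online gradient descent on a pairwise hinge ranking loss.

    Each incoming (x_t, y_t) is paired with the previous example;
    for pairs with different labels we take a gradient step that
    pushes w @ (x_higher - x_lower) toward a margin of 1.
    """
    w, prev = None, None
    for x, y in stream:
        x = np.asarray(x, dtype=float)
        if w is None:
            w = np.zeros_like(x)
        if prev is not None and prev[1] != y:
            xp, yp = prev
            sign = 1.0 if y > yp else -1.0
            diff = sign * (x - xp)        # higher-label minus lower-label
            if 1.0 - w @ diff > 0:        # hinge is active
                w += lr * diff            # (sub)gradient step
        prev = (x, y)
    return w

# Positives load on the first coordinate, negatives on the second.
stream = [([1.0, 0.0], 1), ([0.0, 1.0], 0), ([1.2, 0.0], 1),
          ([0.0, 1.1], 0), ([0.9, 0.1], 1)]
w = ogd_pairwise(stream)
# w ranks a positive above a negative: w @ ([1,0] - [0,1]) > 0
```

Pairing only with the most recent example keeps the update O(1) per instance; the trade-off between such buffering schemes and full pairwise passes is exactly what this line of work analyzes.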
arXiv Detail & Related papers (2021-11-23T18:10:48Z) - A Constraint-Based Algorithm for the Structural Learning of
Continuous-Time Bayesian Networks [70.88503833248159]
We propose the first constraint-based algorithm for learning the structure of continuous-time Bayesian networks.
We discuss the different statistical tests and the underlying hypotheses used by our proposal to establish conditional independence.
arXiv Detail & Related papers (2020-07-07T07:34:09Z) - An Online Method for A Class of Distributionally Robust Optimization
with Non-Convex Objectives [54.29001037565384]
We propose a practical online method for solving a class of online distributionally robust optimization (DRO) problems.
Our studies demonstrate important applications in machine learning for improving the robustness of networks.
arXiv Detail & Related papers (2020-06-17T20:19:25Z) - Dynamic Federated Learning [57.14673504239551]
Federated learning has emerged as an umbrella term for centralized coordination strategies in multi-agent environments.
We consider a federated learning model where at every iteration, a random subset of available agents perform local updates based on their data.
Under a non-stationary random walk model on the true minimizer for the aggregate optimization problem, we establish that the performance of the architecture is determined by three factors, namely, the data variability at each agent, the model variability across all agents, and a tracking term that is inversely proportional to the learning rate of the algorithm.
arXiv Detail & Related papers (2020-02-20T15:00:54Z) - Time Series Alignment with Global Invariances [14.632733235929926]
We propose a novel distance accounting for both feature-space and temporal variabilities by learning a latent global transformation of the feature space together with a temporal alignment.
We present two algorithms for the computation of time series barycenters under this new geometry.
We illustrate the interest of our approach on both simulated and real world data and show the robustness of our approach compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-02-10T15:11:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences arising from its use.