DualDynamics: Synergizing Implicit and Explicit Methods for Robust Irregular Time Series Analysis
- URL: http://arxiv.org/abs/2401.04979v5
- Date: Fri, 21 Feb 2025 02:04:28 GMT
- Title: DualDynamics: Synergizing Implicit and Explicit Methods for Robust Irregular Time Series Analysis
- Authors: YongKyung Oh, Dong-Young Lim, Sungil Kim
- Abstract summary: We introduce 'DualDynamics', a novel framework that combines NDE-based and Neural Flow-based methods. This approach enhances expressive power while balancing computational demands, addressing critical limitations of existing techniques. Our results show consistent outperformance over state-of-the-art methods, indicating DualDynamics' potential to advance irregular time series analysis significantly.
- Score: 3.686808512438363
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Real-world time series analysis faces significant challenges when dealing with irregular and incomplete data. While Neural Differential Equation (NDE) based methods have shown promise, they struggle with limited expressiveness, scalability issues, and stability concerns. Conversely, Neural Flows offer stability but falter with irregular data. We introduce 'DualDynamics', a novel framework that synergistically combines NDE-based and Neural Flow-based methods. This approach enhances expressive power while balancing computational demands, addressing critical limitations of existing techniques. We demonstrate DualDynamics' effectiveness across diverse tasks: classification with robustness to dataset shift, irregularly-sampled series analysis, interpolation of missing data, and forecasting with partial observations. Our results show consistent outperformance over state-of-the-art methods, indicating DualDynamics' potential to advance irregular time series analysis significantly.
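The two solution styles the abstract contrasts can be illustrated with a toy linear system. A flow-based model evaluates a learned solution map directly at any query time, while an NDE-based model integrates the vector field step by step. This is a minimal sketch of the two styles on dx/dt = a·x, not the authors' architecture or their blending scheme:

```python
import math

def flow_solution(x0, a, t):
    """Neural-Flow style: evaluate the solution map x(t) = x0*exp(a*t) directly.
    One function call per query time; no step-by-step integration."""
    return x0 * math.exp(a * t)

def nde_solution(x0, a, t, n_steps=1000):
    """NDE style: integrate dx/dt = a*x numerically (explicit Euler here)."""
    h = t / n_steps
    x = x0
    for _ in range(n_steps):
        x = x + h * (a * x)
    return x

x_flow = flow_solution(1.0, -0.5, 2.0)   # exact: exp(-1)
x_nde = nde_solution(1.0, -0.5, 2.0)     # approximate, converges as n_steps grows
```

The flow is exact and cheap per query but must be an invertible closed form; the integrator handles arbitrary vector fields at the cost of solver error and sequential work, which is the trade-off DualDynamics aims to balance.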
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Stable Neural Stochastic Differential Equations in Analyzing Irregular Time Series Data [3.686808512438363]
Irregular sampling intervals and missing values in real-world time series data present challenges for conventional methods.
We propose three stable classes of Neural SDEs: Langevin-type SDE, Linear Noise SDE, and Geometric SDE.
Our results demonstrate the efficacy of the proposed method in handling real-world irregular time series data.
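The Geometric SDE class named above has the classical form dX = μX dt + σX dW, which can be discretized with Euler–Maruyama over an irregular time grid. This is a generic discretization sketch (constant μ, σ rather than the paper's neural parameterization):

```python
import math
import random

def euler_maruyama_geometric(x0, mu, sigma, t_grid, rng):
    """Euler–Maruyama for the geometric SDE dX = mu*X dt + sigma*X dW.
    t_grid may be irregular, matching the irregular-sampling setting."""
    xs = [x0]
    for t0, t1 in zip(t_grid, t_grid[1:]):
        dt = t1 - t0
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = xs[-1]
        xs.append(x + mu * x * dt + sigma * x * dW)
    return xs

rng = random.Random(0)
t_grid = [0.0, 0.1, 0.25, 0.3, 0.7, 1.0]   # irregular time stamps
path = euler_maruyama_geometric(1.0, 0.05, 0.2, t_grid, rng)
```

Because drift and diffusion are both proportional to the state, the scheme adapts naturally to uneven step sizes dt, which is what makes SDE-based models attractive for irregularly-sampled records.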
arXiv Detail & Related papers (2024-02-22T22:00:03Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Predicting Ordinary Differential Equations with Transformers [65.07437364102931]
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z)
- Anamnesic Neural Differential Equations with Orthogonal Polynomial Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
arXiv Detail & Related papers (2023-03-03T10:49:09Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and can interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Learning Sparse Nonlinear Dynamics via Mixed-Integer Optimization [3.7565501074323224]
We propose an exact formulation of the SINDy problem using mixed-integer optimization (MIO), solving the sparsity-constrained regression problem to provable optimality in seconds.
We illustrate the dramatic improvement of our approach in accurate model discovery while being more sample efficient, robust to noise, and flexible in accommodating physical constraints.
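The sparsity-constrained regression behind SINDy can be made concrete on a tiny library. This sketch substitutes exhaustive support enumeration for the paper's MIO solve (same exact optimum, exponential rather than branch-and-bound cost, and for brevity only 1-sparse fits are handled); the library and dynamics are illustrative:

```python
from itertools import combinations

def best_subset_regression(features, y, k=1):
    """Exact k-sparse least squares by enumerating all size-k supports.
    Stand-in for the MIO solve; only k=1 (closed-form fit) is implemented."""
    if k != 1:
        raise NotImplementedError("toy version handles k=1 only")
    best = None
    for (j,) in combinations(range(len(features)), 1):
        c = features[j]
        coef = sum(ci * yi for ci, yi in zip(c, y)) / sum(ci * ci for ci in c)
        resid = sum((yi - coef * ci) ** 2 for ci, yi in zip(c, y))
        if best is None or resid < best[0]:
            best = (resid, (j,), (coef,))
    return best

# Toy system: dx/dt = -2x, candidate library [1, x, x^2]
xs = [0.5, 1.0, 1.5, 2.0, 2.5]
y = [-2.0 * x for x in xs]
library = [[1.0] * len(xs), list(xs), [x * x for x in xs]]
resid, support, coefs = best_subset_regression(library, y, k=1)
# support == (1,) selects the x term with coefficient -2.0
```

With noiseless data the correct support attains zero residual, so any exact solver (MIO included) is guaranteed to recover it.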
arXiv Detail & Related papers (2022-06-01T01:43:45Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
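The local-versus-global distinction can be demonstrated with two simple smoothers; these are generic stand-ins (a moving average and a global least-squares line), not the specific techniques benchmarked in the paper:

```python
import random

def moving_average(y, window=3):
    """Local smoother: average over a small neighborhood around each point."""
    half = window // 2
    out = []
    for i in range(len(y)):
        seg = y[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def global_line_fit(t, y):
    """Global smoother: least-squares line using the entire record."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
             / sum((ti - tbar) ** 2 for ti in t))
    intercept = ybar - slope * tbar
    return [intercept + slope * ti for ti in t]

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

rng = random.Random(42)
t = [i * 0.1 for i in range(50)]
truth = [2.0 * ti for ti in t]                      # linear ground truth
noisy = [yi + rng.gauss(0.0, 0.3) for yi in truth]  # noisy measurements
```

On data whose trend the global model can represent, the global fit pools all 50 samples per estimate while the moving average pools only 3, so the global method denoises substantially better, mirroring the paper's finding.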
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
- Deep Efficient Continuous Manifold Learning for Time Series Modeling [11.876985348588477]
Symmetric positive definite (SPD) matrices are widely studied in computer vision, signal processing, and medical image analysis.
In this paper, we propose a framework that exploits a diffeomorphism mapping between a Riemannian manifold and a Cholesky space.
For dynamic modeling of time-series data, we devise a continuous manifold learning method by systematically integrating a manifold ordinary differential equation and a gated recurrent neural network.
arXiv Detail & Related papers (2021-12-03T01:38:38Z)
- Compositional Modeling of Nonlinear Dynamical Systems with ODE-based Random Features [0.0]
We present a novel, domain-agnostic approach to tackling this problem.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
arXiv Detail & Related papers (2021-06-10T17:55:13Z)
- Almost Surely Stable Deep Dynamics [4.199844472131922]
We introduce a method for learning provably stable deep neural network based dynamic models from observed data.
Our method works by embedding a Lyapunov neural network into the dynamic model, thereby inherently satisfying the stability criterion.
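The core trick of embedding a Lyapunov function into the dynamics can be sketched with a fixed V(x) = ||x||² instead of a learned Lyapunov network: whenever the candidate vector field would increase V faster than the decay condition allows, project out the offending component along ∇V. This is a generic sketch of the projection construction, not the paper's exact model:

```python
def stabilize(f, x, alpha=1.0, eps=1e-9):
    """Reshape a candidate vector field f so that V(x) = ||x||^2 decreases:
    if grad(V)·f(x) + alpha*V(x) > 0, subtract the violation along grad(V)."""
    fx = f(x)
    grad_v = [2.0 * xi for xi in x]               # ∇V for V = ||x||^2
    v = sum(xi * xi for xi in x)
    violation = sum(g * fi for g, fi in zip(grad_v, fx)) + alpha * v
    if violation > 0.0:
        g2 = sum(g * g for g in grad_v) + eps     # eps guards against x = 0
        fx = [fi - violation * g / g2 for fi, g in zip(fx, grad_v)]
    return fx

# An unstable field f(x) = x; the projected field contracts instead.
f = lambda x: list(x)
x = [1.0, -2.0]
v_initial = sum(xi * xi for xi in x)
h = 0.01
for _ in range(200):
    dx = stabilize(f, x)
    x = [xi + h * di for xi, di in zip(x, dx)]
v_final = sum(xi * xi for xi in x)
```

By construction ∇V·f_stable ≤ -αV at every state, so V decays along trajectories regardless of how badly behaved the underlying field is.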
arXiv Detail & Related papers (2021-03-26T20:37:08Z)
- Training Generative Adversarial Networks by Solving Ordinary Differential Equations [54.23691425062034]
We study the continuous-time dynamics induced by GAN training.
From this perspective, we hypothesise that instabilities in training GANs arise from the integration error.
We experimentally verify that well-known ODE solvers (such as Runge-Kutta) can stabilise training.
arXiv Detail & Related papers (2020-10-28T15:23:49Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
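The regularization itself is a one-liner: perturb the integration end time uniformly at each training step. This sketch wires it into a toy Euler integrator (the dynamics, interval half-width b, and loop are illustrative, not the paper's setup):

```python
import random

def steer_end_time(T, b, rng):
    """STEER-style regularization: instead of always integrating to the fixed
    end time T, draw t1 ~ Uniform(T - b, T + b) at each training step."""
    return rng.uniform(T - b, T + b)

def odeint_euler(f, x0, t1, n_steps=100):
    """Minimal fixed-step Euler integration of dx/dt = f(x) from 0 to t1."""
    h = t1 / n_steps
    x = x0
    for _ in range(n_steps):
        x = x + h * f(x)
    return x

rng = random.Random(7)
T, b = 1.0, 0.25
f = lambda x: -x                      # toy dynamics dx/dt = -x
for _ in range(5):                    # each "training step" sees a new end time
    t1 = steer_end_time(T, b, rng)
    x1 = odeint_euler(f, 1.0, t1)
```

Because each step solves the ODE over a slightly different horizon, the learned dynamics cannot overfit to one fixed integration interval, which is the mechanism behind the reported speed and robustness gains.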
arXiv Detail & Related papers (2020-06-18T17:44:50Z)
- Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series with neural Stochastic Differential Equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction tasks for sporadic time series.
arXiv Detail & Related papers (2020-06-11T01:40:34Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.