STRODE: Stochastic Boundary Ordinary Differential Equation
- URL: http://arxiv.org/abs/2107.08273v1
- Date: Sat, 17 Jul 2021 16:25:46 GMT
- Title: STRODE: Stochastic Boundary Ordinary Differential Equation
- Authors: Hengguan Huang, Hongfu Liu, Hao Wang, Chang Xiao and Ye Wang
- Abstract summary: Most algorithms for time-series modeling fail to learn the dynamics of random event timings directly from visual or audio inputs.
We present a probabilistic ordinary differential equation (ODE) that learns both the timings and the dynamics of time series data without requiring any timing annotations during training.
Our results show that our approach successfully infers event timings of time series data.
- Score: 30.237665903943963
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Perception of time from sequentially acquired sensory inputs is rooted in
everyday behaviors of individual organisms. Yet, most algorithms for
time-series modeling fail to learn the dynamics of random event timings directly
from visual or audio inputs, requiring timing annotations during training that
are usually unavailable for real-world applications. For instance, neuroscience
perspectives on postdiction imply that there exist variable temporal ranges
within which the incoming sensory inputs can affect the earlier perception, but
such temporal ranges are mostly unannotated for real applications such as
automatic speech recognition (ASR). In this paper, we present a probabilistic
ordinary differential equation (ODE), called STochastic boundaRy ODE (STRODE),
that learns both the timings and the dynamics of time series data without
requiring any timing annotations during training. STRODE uses differential
equations to sample from the posterior point process efficiently and
analytically. We further provide theoretical guarantees on the
learning of STRODE. Our empirical results show that our approach successfully
infers event timings of time series data. Our method achieves competitive or
superior performance compared to existing state-of-the-art methods on both
synthetic and real-world datasets.
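The core setup can be pictured as a latent state evolving under an ODE between events whose timings come from a point process. The following is a minimal, hypothetical sketch of that idea only: the homogeneous Poisson sampler stands in for STRODE's learned posterior point process (which the paper samples via differential equations), and the linear dynamics and all function names are invented for illustration.

```python
import numpy as np

def sample_event_times(rate, horizon, rng):
    """Draw event times on [0, horizon) from a homogeneous Poisson process."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)  # exponential inter-arrival gaps
        if t >= horizon:
            return np.array(times)
        times.append(t)

def states_at_events(z0, event_times, horizon, dynamics, dt=0.01):
    """Euler-integrate dz/dt = dynamics(z), recording the state at each event."""
    z, t, states, i = np.asarray(z0, dtype=float), 0.0, [], 0
    while t < horizon:
        z = z + dt * dynamics(z)
        t += dt
        while i < len(event_times) and t >= event_times[i]:
            states.append(z.copy())  # state "observed" at this event time
            i += 1
    return states

rng = np.random.default_rng(0)
events = sample_event_times(rate=2.0, horizon=5.0, rng=rng)
decay = lambda z: np.array([-0.5 * z[0], z[0]])  # toy latent dynamics
states = states_at_events([1.0, 0.0], events, horizon=5.0, dynamics=decay)
```

In STRODE the event-time distribution itself is learned jointly with the dynamics, without timing labels; here both are fixed purely to make the interplay between the ODE solve and the sampled timings concrete.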
Related papers
- SONNET: Enhancing Time Delay Estimation by Leveraging Simulated Audio [17.811771707446926]
We show that learning-based methods can, even when trained on synthetic data, significantly outperform GCC-PHAT on novel real-world data.
We provide our trained model, SONNET, which is runnable in real-time and works on novel data out of the box for many real data applications.
arXiv Detail & Related papers (2024-11-20T10:23:21Z)
- Motion Code: Robust Time Series Classification and Forecasting via Sparse Variational Multi-Stochastic Processes Learning [3.2857981869020327]
We propose a novel framework that views each time series as a realization of a continuous-time process.
This mathematical approach captures dependencies across timestamps and detects hidden, time-varying signals within the noise.
Experiments on noisy datasets, including real-world Parkinson's disease sensor tracking, demonstrate Motion Code's strong performance against established benchmarks.
arXiv Detail & Related papers (2024-02-21T19:10:08Z)
- Foundational Inference Models for Dynamical Systems [5.549794481031468]
We offer a fresh perspective on the classical problem of imputing missing time series data, whose underlying dynamics are assumed to be determined by ODEs.
We propose a novel supervised learning framework for zero-shot time series imputation, through parametric functions satisfying some (hidden) ODEs.
We empirically demonstrate that one and the same (pretrained) recognition model can perform zero-shot imputation across 63 distinct time series with missing values.
arXiv Detail & Related papers (2024-02-12T11:48:54Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Exact Inference for Continuous-Time Gaussian Process Dynamics [6.941863788146731]
In practice, the true system is often unknown and has to be learned from measurement data.
Most methods in Gaussian process (GP) dynamics model learning are trained on one-step-ahead predictions.
We show how to derive flexible inference schemes for these types of evaluations.
arXiv Detail & Related papers (2023-09-05T16:07:00Z)
- DIVERSIFY: A General Framework for Time Series Out-of-distribution Detection and Generalization [58.704753031608625]
Time series are among the most challenging modalities in machine learning research.
OOD detection and generalization on time series tend to suffer due to their non-stationary nature.
We propose DIVERSIFY, a framework for OOD detection and generalization on dynamic distributions of time series.
arXiv Detail & Related papers (2023-08-04T12:27:11Z)
- Encoding Time-Series Explanations through Self-Supervised Model Behavior Consistency [26.99599329431296]
We present TimeX, a time series consistency model for training explainers.
TimeX trains an interpretable surrogate to mimic the behavior of a pretrained time series model.
We evaluate TimeX on eight synthetic and real-world datasets and compare its performance against state-of-the-art interpretability methods.
arXiv Detail & Related papers (2023-06-03T13:25:26Z)
- Continuous-Time Modeling of Counterfactual Outcomes Using Neural Controlled Differential Equations [84.42837346400151]
Estimating counterfactual outcomes over time has the potential to unlock personalized healthcare.
Existing causal inference approaches consider regular, discrete-time intervals between observations and treatment decisions.
We propose a controllable simulation environment based on a model of tumor growth for a range of scenarios.
arXiv Detail & Related papers (2022-06-16T17:15:15Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- Learning summary features of time series for likelihood free inference [93.08098361687722]
We present a data-driven strategy for automatically learning summary features from time series data.
Our results indicate that summary features learned from data can compete with and even outperform LFI methods based on hand-crafted values.
arXiv Detail & Related papers (2020-12-04T19:21:37Z)
- STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
arXiv Detail & Related papers (2020-06-18T17:44:50Z)
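The STEER regularizer summarized in the last entry above is simple enough to sketch directly. This is a minimal, hypothetical illustration, not the paper's implementation: rather than always integrating a neural ODE to its fixed end time t1, each training step integrates to a random end time drawn uniformly around t1. The fixed-step Euler solver, the toy dynamics, and the perturbation width b = 0.2 are illustrative choices.

```python
import random

def euler_odeint(f, z0, t0, t1, steps=100):
    """Fixed-step Euler integration of dz/dt = f(z) from t0 to t1."""
    z, dt = z0, (t1 - t0) / steps
    for _ in range(steps):
        z = z + dt * f(z)
    return z

def steer_end_time(t1, b, rng):
    """STEER-style perturbation of the integration end time during training."""
    return t1 + rng.uniform(-b, b)

rng = random.Random(0)
# Three "training steps", each seeing a slightly different integration horizon.
ends = [steer_end_time(t1=1.0, b=0.2, rng=rng) for _ in range(3)]
outs = [euler_odeint(lambda z: -z, 1.0, 0.0, t) for t in ends]
```

At evaluation time the ODE is integrated to the nominal end time as usual; only the training-time horizon is randomized, which is what makes the regularizer essentially free to implement.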
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.