Modeling Irregular Astronomical Time Series with Neural Stochastic Delay Differential Equations
- URL: http://arxiv.org/abs/2508.17521v1
- Date: Sun, 24 Aug 2025 21:06:32 GMT
- Title: Modeling Irregular Astronomical Time Series with Neural Stochastic Delay Differential Equations
- Authors: YongKyung Oh, Seungsu Kam, Dong-Young Lim, Sungil Kim
- Abstract summary: We introduce a new framework based on Neural Stochastic Delay Differential Equations (Neural SDDEs). Our approach integrates a delay-aware neural architecture, a numerical solver for SDDEs, and mechanisms to robustly learn from noisy, sparse sequences. Experiments on irregularly sampled astronomical data demonstrate strong classification accuracy and effective detection of novel astrophysical events, even with partial labels.
- Score: 13.404503606887717
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Astronomical time series from large-scale surveys like LSST are often irregularly sampled and incomplete, posing challenges for classification and anomaly detection. We introduce a new framework based on Neural Stochastic Delay Differential Equations (Neural SDDEs) that combines stochastic modeling with neural networks to capture delayed temporal dynamics and handle irregular observations. Our approach integrates a delay-aware neural architecture, a numerical solver for SDDEs, and mechanisms to robustly learn from noisy, sparse sequences. Experiments on irregularly sampled astronomical data demonstrate strong classification accuracy and effective detection of novel astrophysical events, even with partial labels. This work highlights Neural SDDEs as a principled and practical tool for time series analysis under observational constraints.
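As an illustration of the kind of numerical solver such a framework requires, the following is a minimal Euler-Maruyama sketch for an SDDE with a single constant delay. The drift `f`, diffusion `g`, and history function here are hypothetical stand-ins for the paper's learned neural components, not its actual architecture.

```python
import numpy as np

def euler_maruyama_sdde(f, g, history, tau, t_max, dt, rng=None):
    """Simulate dX = f(X(t), X(t - tau)) dt + g(X(t)) dW on [0, t_max].

    history: callable giving X(t) for t in [-tau, 0] (the initial segment).
    f, g: drift and diffusion; stand-ins for learned neural networks.
    """
    rng = np.random.default_rng(rng)
    n_steps = int(round(t_max / dt))
    lag = int(round(tau / dt))           # delay expressed in grid steps
    # Pre-fill the initial segment X(-tau), ..., X(0) from the history function.
    xs = [history(-tau + i * dt) for i in range(lag + 1)]
    for _ in range(n_steps):
        x_now = xs[-1]
        x_delayed = xs[-1 - lag]         # X(t - tau), looked up on the grid
        dW = rng.normal(0.0, np.sqrt(dt))
        xs.append(x_now + f(x_now, x_delayed) * dt + g(x_now) * dW)
    return np.array(xs[lag:])            # trajectory on [0, t_max]
```

With a neural drift and diffusion, the same loop becomes the forward pass of a Neural SDDE; the constant-lag lookup is what distinguishes it from an ordinary Euler-Maruyama SDE step.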
Related papers
- Neural MJD: Neural Non-Stationary Merton Jump Diffusion for Time Series Prediction [13.819057582932214]
We introduce Neural MJD, a neural network based non-stationary Merton jump diffusion (MJD) model. Our model explicitly formulates forecasting as a stochastic differential equation (SDE) simulation problem. To enable tractable learning, we introduce a likelihood truncation mechanism that caps the number of jumps within small time intervals.
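For intuition on the truncation idea, the sketch below simulates a standard Merton jump-diffusion path while capping the Poisson jump count in each step. This is a simulation-side analogue of capping jumps per interval, not the paper's likelihood truncation mechanism itself, and all parameters are illustrative.

```python
import numpy as np

def simulate_mjd_path(mu, sigma, lam, jump_mu, jump_sigma, x0, t_max, dt,
                      max_jumps_per_step=3, rng=None):
    """Simulate a Merton jump-diffusion path in log space.

    The Poisson jump count per interval is capped at max_jumps_per_step,
    mirroring (in simulation) the idea of bounding jumps in small intervals.
    """
    rng = np.random.default_rng(rng)
    n_steps = int(round(t_max / dt))
    log_x = np.empty(n_steps + 1)
    log_x[0] = np.log(x0)
    for n in range(n_steps):
        # Standard geometric-Brownian increment of the log price.
        diffusion = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.normal()
        # Truncated jump count: at most max_jumps_per_step jumps per interval.
        k = min(rng.poisson(lam * dt), max_jumps_per_step)
        jumps = rng.normal(jump_mu, jump_sigma, size=k).sum() if k else 0.0
        log_x[n + 1] = log_x[n] + diffusion + jumps
    return np.exp(log_x)
```

Capping `k` bounds the number of jump terms entering each interval, which is what makes the corresponding (truncated) likelihood a finite sum.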
arXiv Detail & Related papers (2025-06-05T01:23:28Z) - Detecting Anomalies in Dynamic Graphs via Memory enhanced Normality [39.476378833827184]
Anomaly detection in dynamic graphs presents a significant challenge due to the temporal evolution of graph structures and attributes.
We introduce a novel spatial-temporal memories-enhanced graph autoencoder (STRIPE).
STRIPE significantly outperforms existing methods, with a 5.8% improvement in AUC scores and 4.62X faster training time.
arXiv Detail & Related papers (2024-03-14T02:26:10Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Correlation-aware Spatial-Temporal Graph Learning for Multivariate Time-series Anomaly Detection [67.60791405198063]
We propose a correlation-aware spatial-temporal graph learning framework (termed CST-GL) for time series anomaly detection.
CST-GL explicitly captures the pairwise correlations via a multivariate time series correlation learning module.
A novel anomaly scoring component is further integrated into CST-GL to estimate the degree of an anomaly in a purely unsupervised manner.
arXiv Detail & Related papers (2023-07-17T11:04:27Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Stacked Residuals of Dynamic Layers for Time Series Anomaly Detection [0.0]
We present an end-to-end differentiable neural network architecture to perform anomaly detection in multivariate time series.
The architecture is a cascade of dynamical systems designed to separate linearly predictable components of the signal.
The anomaly detector exploits the temporal structure of the prediction residuals to detect both isolated point anomalies and set-point changes.
arXiv Detail & Related papers (2022-02-25T01:50:22Z) - Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Crop Classification under Varying Cloud Cover with Neural Ordinary Differential Equations [23.93148719731374]
State-of-the-art methods for crop classification rely on techniques that implicitly assume regular temporal spacing between observations.
We propose to use neural ordinary differential equations (NODEs) in combination with recurrent neural networks (RNNs) to classify crop types in irregularly spaced image sequences.
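The NODE-plus-RNN pattern for irregular spacing can be sketched as an ODE-RNN-style encoder: evolve the hidden state continuously across each (irregular) gap, then apply a discrete update at each observation. `ode_f` and `update` below are hypothetical stand-ins for the learned networks, and the Euler integrator is a simplification of a full ODE solver.

```python
import numpy as np

def ode_rnn_encode(times, values, h0, ode_f, update, dt=0.05):
    """Encode an irregularly sampled sequence, ODE-RNN style.

    times, values: observation timestamps and measurements (any spacing).
    ode_f: hidden-state dynamics dh/dt = ode_f(h)  (stand-in for a NODE).
    update: discrete RNN-style update applied at each observation.
    """
    h, t_prev = np.asarray(h0, dtype=float), times[0]
    for t, x in zip(times, values):
        # Euler-integrate the hidden state across the (irregular) gap.
        gap = t - t_prev
        n_sub = max(1, int(np.ceil(gap / dt)))
        for _ in range(n_sub):
            h = h + (gap / n_sub) * ode_f(h)
        h = update(h, x)                 # discrete jump at the observation
        t_prev = t
    return h
```

Because the continuous evolution consumes the actual gap length, no regular temporal spacing between observations is assumed, which is the property the crop-classification setting (and irregular astronomical sampling generally) requires.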
arXiv Detail & Related papers (2020-12-04T11:56:50Z) - Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series via neural Stochastic Differential Equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and interpolation tasks for sporadic time series.
arXiv Detail & Related papers (2020-06-11T01:40:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.