SELDON: Supernova Explosions Learned by Deep ODE Networks
- URL: http://arxiv.org/abs/2603.04392v1
- Date: Wed, 04 Mar 2026 18:57:21 GMT
- Title: SELDON: Supernova Explosions Learned by Deep ODE Networks
- Authors: Jiezhong Wu, Jack O'Brien, Jennifer Li, M. S. Krafczyk, Ved G. Shah, Amanda R. Wasserman, Daniel W. Apley, Gautham Narayan, Noelle I. Samia
- Abstract summary: A continuous-time forecasting AI model is of interest because it can deliver millisecond-scale inference for thousands of objects per day. In this paper, we propose SELDON, a new continuous-time variational autoencoder for panels of sparse and irregularly time-sampled (gappy) astrophysical light curves.
- Score: 0.06732575555508598
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The discovery rate of optical transients will explode to 10 million public alerts per night once the Vera C. Rubin Observatory's Legacy Survey of Space and Time comes online, overwhelming the traditional physics-based inference pipelines. A continuous-time forecasting AI model is of interest because it can deliver millisecond-scale inference for thousands of objects per day, whereas legacy MCMC codes need hours per object. In this paper, we propose SELDON, a new continuous-time variational autoencoder for panels of sparse and irregularly time-sampled (gappy) astrophysical light curves that are nonstationary, heteroscedastic, and inherently dependent. SELDON combines a masked GRU-ODE encoder with a latent neural ODE propagator and an interpretable Gaussian-basis decoder. The encoder learns to summarize panels of imbalanced and correlated data even when only a handful of points are observed. The neural ODE then integrates this hidden state forward in continuous time, extrapolating to future unseen epochs. This extrapolated time series is further encoded by deep sets to a latent distribution that is decoded to a weighted sum of Gaussian basis functions, the parameters of which are physically meaningful. Such parameters (e.g., rise time, decay rate, peak flux) directly drive downstream prioritization of spectroscopic follow-up for astrophysical surveys. Beyond astronomy, the architecture of SELDON offers a generic recipe for interpretable and continuous-time sequence modeling in any time domain where data are multivariate, sparse, heteroscedastic, and irregularly spaced.
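The pipeline described in the abstract (latent neural ODE propagation followed by a Gaussian-basis decoder) can be sketched in a few lines. This is a hypothetical illustration under simplified assumptions: the Euler integrator, the fixed dynamics `f`, and all function names and parameters are assumptions for exposition, not the authors' code.

```python
import numpy as np

def euler_latent_ode(z0, f, t_grid):
    """Integrate a latent state forward in continuous time with Euler steps,
    standing in for the paper's neural-ODE propagator (assumed dynamics f)."""
    zs = [np.asarray(z0, dtype=float)]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        zs.append(zs[-1] + (t1 - t0) * f(zs[-1]))
    return np.stack(zs)

def gaussian_basis_curve(t, weights, centers, widths):
    """Decode a light curve as a weighted sum of Gaussian basis functions.
    Each parameter maps to something physical: centers ~ epoch of peak flux,
    widths ~ rise/decay timescales, weights ~ peak-flux contributions."""
    t = np.asarray(t, dtype=float)[:, None]               # (T, 1)
    basis = np.exp(-0.5 * ((t - centers) / widths) ** 2)  # (T, K)
    return basis @ weights                                # (T,)

# Propagate a 2-d latent state, then decode a one-component light curve.
z_path = euler_latent_ode([1.0, 0.0], lambda z: -z, np.linspace(0.0, 1.0, 101))
flux = gaussian_basis_curve([0.0, 5.0], np.array([2.0]),
                            np.array([5.0]), np.array([1.0]))
```

Because the decoder parameters are interpretable, downstream follow-up prioritization can read rise time, decay rate, and peak flux directly from `centers`, `widths`, and `weights` rather than from a black-box embedding.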
Related papers
- QuaRK: A Quantum Reservoir Kernel for Time Series Learning [11.195918728130088]
QuaRK is an end-to-end framework that couples a hardware-realistic quantum reservoir featurizer with a kernel-based readout scheme.
We provide learning-theoretic guarantees for dependent temporal data, linking design and resource choices to finite-sample performance.
arXiv Detail & Related papers (2026-02-14T00:04:52Z)
- Modeling Irregular Astronomical Time Series with Neural Stochastic Delay Differential Equations [13.404503606887717]
We introduce a new framework based on Neural Stochastic Delay Differential Equations (Neural SDDEs).
Our approach integrates a delay-aware neural architecture, a numerical solver for SDDEs, and mechanisms to robustly learn from noisy, sparse sequences.
Experiments on irregularly sampled astronomical data demonstrate strong classification accuracy and effective detection of novel astrophysical events, even with partial labels.
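The delay-aware dynamics underlying an SDDE can be illustrated with a toy fixed-step Euler integration of dz/dt = f(z(t), z(t - tau)). This is a deterministic sketch only; the paper's model also includes a stochastic term and a learned f, and all names here are illustrative assumptions.

```python
import numpy as np

def euler_dde(history, f, dt, tau, steps):
    """Euler-integrate a delay differential equation dz/dt = f(z(t), z(t - tau)).
    `history` holds the state on the interval [-tau, 0] (length >= lag + 1)."""
    lag = int(round(tau / dt))     # delay expressed in grid steps
    zs = list(history)
    for _ in range(steps):
        zs.append(zs[-1] + dt * f(zs[-1], zs[-1 - lag]))
    return np.array(zs)

# Damping driven by the delayed state: dz/dt = -z(t - 0.5).
path = euler_dde([1.0] * 6, lambda z, z_delayed: -z_delayed,
                 dt=0.1, tau=0.5, steps=20)
```

The key difference from an ordinary ODE is that the update at time t consults the state a fixed lag in the past, which is what a delay-aware neural architecture must encode.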
arXiv Detail & Related papers (2025-08-24T21:06:32Z)
- Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [42.60778405812048]
We introduce FNF as the backbone and DBD as the architecture to provide excellent learning capabilities and optimal learning pathways for spatial-temporal modeling.
We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
arXiv Detail & Related papers (2025-06-10T18:40:20Z)
- WaveGNN: Modeling Irregular Multivariate Time Series for Accurate Predictions [3.489870763747715]
Real-world time series often exhibit irregularities such as misaligned timestamps, missing entries, and variable sampling rates.
Existing approaches often rely on imputation, which can introduce biases.
We present WaveGNN, a novel framework designed to embed irregularly sampled time series data for accurate predictions.
arXiv Detail & Related papers (2024-12-14T00:03:44Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
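The linear latent prior can be illustrated with a toy rollout: once the conditional dynamics are a single linear map A, multi-step evolution is just repeated matrix multiplication. The operator A, the latent dimension, and the rollout length below are assumptions for illustration, not values from the paper.

```python
import numpy as np

# A Koopman-style linear latent prior: z_{t+1} = A @ z_t.
A = np.array([[0.9, -0.2],
              [0.2,  0.9]])        # assumed stable linear Koopman operator
z = np.array([1.0, 0.0])           # initial latent state
trajectory = [z]
for _ in range(10):
    trajectory.append(A @ trajectory[-1])
trajectory = np.stack(trajectory)  # (11, 2) latent rollout
```

Linearity in the latent space is what makes the prior analyzable: stability and long-horizon behavior follow directly from the spectrum of A.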
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- ARFA: An Asymmetric Receptive Field Autoencoder Model for Spatiotemporal Prediction [55.30913411696375]
We propose an Asymmetric Receptive Field Autoencoder (ARFA) model, which introduces receptive-field modules of corresponding sizes in the encoder and decoder.
In the encoder, we present a large-kernel module for global spatiotemporal feature extraction. In the decoder, we develop a small-kernel module for local spatiotemporal reconstruction.
We construct the RainBench, a large-scale radar echo dataset for precipitation prediction, to address the scarcity of meteorological data in the domain.
arXiv Detail & Related papers (2023-09-01T07:55:53Z)
- Correlation-aware Spatial-Temporal Graph Learning for Multivariate Time-series Anomaly Detection [67.60791405198063]
We propose a correlation-aware spatial-temporal graph learning (termed CST-GL) for time series anomaly detection.
CST-GL explicitly captures the pairwise correlations via a multivariate time series correlation learning module.
A novel anomaly scoring component is further integrated into CST-GL to estimate the degree of an anomaly in a purely unsupervised manner.
arXiv Detail & Related papers (2023-07-17T11:04:27Z)
- A Probabilistic Autoencoder for Type Ia Supernovae Spectral Time Series [1.3316902142331577]
We construct a probabilistic autoencoder to learn the intrinsic diversity of type Ia supernovae (SNe Ia) from a sparse set of spectral time series.
We show that the PAE learns a low-dimensional latent space that captures the nonlinear range of features that exists within the population.
We then use our PAE in a number of downstream tasks on SNe Ia for increasingly precise cosmological analyses.
arXiv Detail & Related papers (2022-07-15T17:58:27Z)
- Towards Generating Real-World Time Series Data [52.51620668470388]
We propose a novel generative framework for time series data generation - RTSGAN.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
arXiv Detail & Related papers (2021-11-16T11:31:37Z)
- Heteroscedastic Temporal Variational Autoencoder For Irregular Time Series [15.380441563675243]
We propose a new deep learning framework for irregularly sampled time series that we call the Heteroscedastic Temporal Variational Autoencoder (HeTVAE).
HeTVAE includes a novel input layer to encode information about input observation sparsity, a temporal VAE architecture to propagate uncertainty due to input sparsity, and a heteroscedastic output layer to enable variable uncertainty in the output.
Our results show that the proposed architecture is better able to reflect variable uncertainty through time under sparse and irregular sampling than a range of baseline and traditional models, as well as recently proposed deep latent variable models.
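A heteroscedastic output layer of the kind described here can be sketched as a decoder head that emits both a mean and a log-variance per time point, so predictive uncertainty can vary across the sequence. Names, shapes, and weights below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def heteroscedastic_head(h, w_mu, w_logvar):
    """Map decoder features h to a per-point mean and standard deviation."""
    mu = h @ w_mu                           # per-point predictive mean
    sigma = np.exp(0.5 * (h @ w_logvar))    # per-point standard deviation
    return mu, sigma

def gaussian_nll(y, mu, sigma):
    """Mean Gaussian negative log-likelihood used to train such a head."""
    return np.mean(0.5 * np.log(2 * np.pi * sigma**2)
                   + (y - mu)**2 / (2 * sigma**2))

h = np.ones((4, 3))                          # toy decoder features, 4 time points
mu, sigma = heteroscedastic_head(h, np.zeros(3), np.zeros(3))
loss = gaussian_nll(np.zeros(4), mu, sigma)
```

Training with the Gaussian NLL rather than squared error is what lets the model widen `sigma` where observations are sparse instead of committing to an overconfident mean.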
arXiv Detail & Related papers (2021-07-23T16:59:21Z)
- Adaptive Latent Space Tuning for Non-Stationary Distributions [62.997667081978825]
We present a method for adaptive tuning of the low-dimensional latent space of deep encoder-decoder style CNNs.
We demonstrate our approach for predicting the properties of a time-varying charged particle beam in a particle accelerator.
arXiv Detail & Related papers (2021-05-08T03:50:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.