The effect of phased recurrent units in the classification of multiple
catalogs of astronomical lightcurves
- URL: http://arxiv.org/abs/2106.03736v1
- Date: Mon, 7 Jun 2021 16:01:38 GMT
- Title: The effect of phased recurrent units in the classification of multiple
catalogs of astronomical lightcurves
- Authors: C. Donoso-Oliva, G. Cabrera-Vives, P. Protopapas, R. Carrasco-Davis,
and P.A. Estevez
- Abstract summary: We study the effectiveness of the LSTM and Phased LSTM based architectures for the classification of astronomical lightcurves.
Our findings show that LSTM outperformed PLSTM on 6/7 datasets.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: In the new era of very large telescopes, where data is crucial to expand
scientific knowledge, we have witnessed many deep learning applications for the
automatic classification of lightcurves. Recurrent neural networks (RNNs) are
one of the models used for these applications, and the LSTM unit stands out for
being an excellent choice for the representation of long time series. In
general, RNNs assume observations at discrete times, which may not suit the
irregular sampling of lightcurves. A traditional technique to address irregular
sequences consists of adding the sampling time to the network's input, but this
is not guaranteed to capture sampling irregularities during training.
Alternatively, the Phased LSTM (PLSTM) unit has been created to address this problem by
updating its state using the sampling times explicitly. In this work, we study
the effectiveness of the LSTM and Phased LSTM based architectures for the
classification of astronomical lightcurves. We use seven catalogs containing
periodic and nonperiodic astronomical objects. Our findings show that LSTM
outperformed PLSTM on 6/7 datasets. However, the combination of both units
enhances the results in all datasets.
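The key mechanism of the Phased LSTM described above is a periodic time gate: the cell is open to updates only during a short fraction of each learned oscillation period, so state updates are driven by the actual observation timestamps. A minimal plain-Python sketch of the gate's openness function, following the standard Phased LSTM formulation (function and parameter names here are illustrative, not the paper's code):

```python
import math

def time_gate(t, tau, shift, r_on=0.05, alpha=0.001):
    """Openness k(t) of a Phased LSTM time gate.

    tau:   oscillation period (learned per unit)
    shift: phase offset of the gate
    r_on:  fraction of the period during which the gate is open
    alpha: small leak applied while the gate is closed
    """
    phi = math.fmod(t - shift, tau) / tau   # phase within the cycle, in [0, 1)
    if phi < 0.5 * r_on:                    # first half of the open phase: ramp up
        return 2.0 * phi / r_on
    if phi < r_on:                          # second half: ramp down
        return 2.0 - 2.0 * phi / r_on
    return alpha * phi                      # closed: tiny leak so gradients still flow

# The cell and hidden states are then updated as a k(t)-weighted blend of the
# proposed new state and the previous state, concentrating updates near the
# learned timestamps rather than at every (possibly irregular) step.
```

Because `k(t)` depends only on the timestamp, irregular sampling is handled explicitly rather than being inferred from time deltas appended to the input.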
Related papers
- Solar Flare Prediction Using Long Short-term Memory (LSTM) and Decomposition-LSTM with Sliding Window Pattern Recognition [0.0]
The dataset spans 2003 to 2023 and includes 151,071 flare events.
A sliding-window technique is employed to detect temporal quasi-patterns in both irregular and regularized flare time series.
LSTM and DLSTM models are trained on sequences of peak flux and waiting times from the irregular time series, while LSTM and DLSTM, integrated with an ensemble approach, are applied to sliding windows of the regularized time series with a 3-hour interval.
DLSTM with an ensemble approach on the regularized time series outperforms the other models, offering more accurate large-flare forecasts with fewer false alarms than models trained on the irregular time series.
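The sliding-window preprocessing mentioned above can be sketched generically: cut a series into fixed-width overlapping windows, each paired with the value that follows it as the prediction target (the function name and interface are illustrative, not the paper's code):

```python
def sliding_windows(series, width, step=1):
    """Cut a 1-D series into overlapping windows with next-value targets."""
    windows, targets = [], []
    for start in range(0, len(series) - width, step):
        windows.append(series[start:start + width])   # model input window
        targets.append(series[start + width])         # value to predict next
    return windows, targets
```

Each (window, target) pair then becomes one training example for an LSTM-style forecaster.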
arXiv Detail & Related papers (2025-07-07T13:17:38Z)
- LSCD: Lomb-Scargle Conditioned Diffusion for Time Series Imputation [55.800319453296886]
Time series with missing or irregularly sampled data are a persistent challenge in machine learning.
We introduce a differentiable Lomb-Scargle layer that enables reliable computation of the power spectrum of irregularly sampled data.
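The classic Lomb-Scargle periodogram that underlies such a layer can be computed directly from irregular timestamps; a plain-Python sketch of the standard (non-differentiable) formulation, not the paper's layer:

```python
import math

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle periodogram for irregularly sampled data.

    t: observation times, y: values, freqs: trial frequencies (cycles/unit time).
    Returns one power value per trial frequency.
    """
    powers = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # Time offset tau makes the result invariant to shifts of the time origin.
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        yc = sum(yi * ci for yi, ci in zip(y, c))
        ys = sum(yi * si for yi, si in zip(y, s))
        cc = sum(ci * ci for ci in c)
        ss = sum(si * si for si in s)
        powers.append(0.5 * (yc * yc / cc + ys * ys / ss))
    return powers
```

For a periodic signal, the power peaks at the true frequency even when the sampling is highly irregular, which is what makes the periodogram useful as a conditioning signal.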
arXiv Detail & Related papers (2025-06-20T14:48:42Z)
- LLM-PS: Empowering Large Language Models for Time Series Forecasting with Temporal Patterns and Semantics [56.99021951927683]
Time Series Forecasting (TSF) is critical in many real-world domains like financial planning and health monitoring.
Existing Large Language Models (LLMs) usually perform suboptimally because they neglect the inherent characteristics of time series data.
We propose LLM-PS to empower the LLM for TSF by learning the fundamental Patterns and meaningful Semantics from time series data.
arXiv Detail & Related papers (2025-03-12T11:45:11Z)
- DeLELSTM: Decomposition-based Linear Explainable LSTM to Capture Instantaneous and Long-term Effects in Time Series [26.378073712630467]
We propose a Decomposition-based Linear Explainable LSTM (DeLELSTM) to improve the interpretability of LSTM.
We demonstrate the effectiveness and interpretability of DeLELSTM on three empirical datasets.
arXiv Detail & Related papers (2023-08-26T07:45:41Z)
- Correlation-aware Spatial-Temporal Graph Learning for Multivariate Time-series Anomaly Detection [67.60791405198063]
We propose a correlation-aware spatial-temporal graph learning framework (termed CST-GL) for multivariate time series anomaly detection.
CST-GL explicitly captures the pairwise correlations via a multivariate time series correlation learning module.
A novel anomaly scoring component is further integrated into CST-GL to estimate the degree of an anomaly in a purely unsupervised manner.
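The pairwise-correlation idea can be illustrated in miniature: a Pearson correlation matrix over the channels of a multivariate series is the kind of statistic such a correlation learning module starts from (illustrative code, not CST-GL's module):

```python
def pearson_corr_matrix(series):
    """Pairwise Pearson correlations between the channels of a
    multivariate time series, given as a list of equal-length channels."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        sa = sum((x - ma) ** 2 for x in a) ** 0.5
        sb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (sa * sb)
    return [[corr(a, b) for b in series] for a in series]
```

Channels that rise and fall together get entries near +1, anti-correlated channels near -1; a graph-learning module can then treat strong correlations as edges between channels.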
arXiv Detail & Related papers (2023-07-17T11:04:27Z)
- Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects [84.6945070729684]
Self-supervised learning (SSL) has recently achieved impressive performance on various time series tasks.
This article reviews current state-of-the-art SSL methods for time series data.
arXiv Detail & Related papers (2023-06-16T18:23:10Z)
- A Bi-LSTM Autoencoder Framework for Anomaly Detection -- A Case Study of a Wind Power Dataset [2.094022863940315]
Anomalies are data points or events that deviate from the otherwise normal, homogeneous behavior of the data.
This study presents a novel framework for time series anomaly detection using a combination of Bi-LSTM architecture and Autoencoder.
The Bi-LSTM Autoencoder model achieved a classification accuracy of 96.79% and outperformed more commonly used LSTM Autoencoder models.
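An autoencoder-based detector like the one above typically flags points whose reconstruction error exceeds a threshold derived from the error distribution; a minimal sketch of that final scoring step (the mean-plus-k-sigma rule is a common choice, not necessarily the paper's):

```python
def flag_anomalies(errors, k=3.0):
    """Flag indices whose reconstruction error exceeds mean + k * std."""
    mu = sum(errors) / len(errors)
    var = sum((e - mu) ** 2 for e in errors) / len(errors)
    thresh = mu + k * var ** 0.5
    return [i for i, e in enumerate(errors) if e > thresh]
```

The autoencoder is trained on normal data only, so anomalous windows reconstruct poorly and their errors stand out above the threshold.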
arXiv Detail & Related papers (2023-03-17T00:24:28Z)
- Image Classification using Sequence of Pixels [3.04585143845864]
This study compares sequential image classification methods based on recurrent neural networks.
We describe methods based on Long Short-Term Memory (LSTM) and bidirectional LSTM (BiLSTM) architectures, among others.
arXiv Detail & Related papers (2022-09-23T09:42:44Z)
- DCSF: Deep Convolutional Set Functions for Classification of Asynchronous Time Series [5.339109578928972]
An asynchronous time series is one in which the channels are observed asynchronously and independently of one another.
This paper proposes a novel framework, that is highly scalable and memory efficient, for the asynchronous time series classification task.
We explore convolutional neural networks, which are well researched for the closely related problem of classifying regularly sampled and fully observed time series.
arXiv Detail & Related papers (2022-08-24T08:47:36Z)
- Learning Mixtures of Linear Dynamical Systems [94.49754087817931]
We develop a two-stage meta-algorithm to efficiently recover each ground-truth LDS model up to error $\tilde{O}(\sqrt{d/T})$.
We validate our theoretical studies with numerical experiments, confirming the efficacy of the proposed algorithm.
arXiv Detail & Related papers (2022-01-26T22:26:01Z)
- Deep Attention-Based Supernovae Classification of Multi-Band Light-Curves [0.0]
Supernovae (SNe) are relatively uncommon objects compared to other classes of variable events.
Processing multi-band light-curves is a challenging task due to highly irregular cadence, long time gaps, missing values, and low numbers of observations.
We propose a Deep Attention model called TimeModAttn to classify multi-band light-curves of different SN types.
We also propose a model for the synthetic generation of SN multi-band light-curves based on the Supernova Parametric Model (SPM).
arXiv Detail & Related papers (2022-01-20T22:48:40Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while utilizing substantially less trainable parameters when compared to comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
- Learning summary features of time series for likelihood free inference [93.08098361687722]
We present a data-driven strategy for automatically learning summary features from time series data.
Our results indicate that learning summary features from data can compete and even outperform LFI methods based on hand-crafted values.
arXiv Detail & Related papers (2020-12-04T19:21:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.