RotNet: Fast and Scalable Estimation of Stellar Rotation Periods Using
Convolutional Neural Networks
- URL: http://arxiv.org/abs/2012.01985v2
- Date: Fri, 4 Dec 2020 02:35:19 GMT
- Title: RotNet: Fast and Scalable Estimation of Stellar Rotation Periods Using
Convolutional Neural Networks
- Authors: J. Emmanuel Johnson, Sairam Sundaresan, Tansu Daylan, Lisseth Gavilan,
Daniel K. Giles, Stela Ishitani Silva, Anna Jungbluth, Brett Morris, Andrés
Muñoz-Jaramillo
- Abstract summary: We harness the power of deep learning to regress stellar rotation periods from Kepler light curves.
We benchmark our method against a random forest regressor, a 1D CNN, and the Auto-Correlation Function (ACF) - the current standard to estimate rotation periods.
- Score: 0.903415485511869
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Magnetic activity in stars manifests as dark spots on their surfaces that
modulate the brightness observed by telescopes. These light curves contain
important information on stellar rotation. However, the accurate estimation of
rotation periods is computationally expensive due to scarce ground truth
information, noisy data, and large parameter spaces that lead to degenerate
solutions. We harness the power of deep learning and successfully apply
Convolutional Neural Networks to regress stellar rotation periods from Kepler
light curves. Geometry-preserving time-series to image transformations of the
light curves serve as inputs to a ResNet-18 based architecture which is trained
through transfer learning. The McQuillan catalog of published rotation periods
is used as an ansatz for the ground truth. We benchmark the performance of our method
against a random forest regressor, a 1D CNN, and the Auto-Correlation Function
(ACF) - the current standard to estimate rotation periods. Despite limiting our
input to fewer data points (1k), our model yields more accurate results and
runs 350 times faster than ACF runs on the same number of data points and
10,000 times faster than ACF runs on 65k data points. With only minimal feature
engineering, our approach has impressive accuracy, motivating the application of
deep learning to regress stellar parameters on an even larger scale.
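As a rough illustration of the two approaches compared in the abstract (a classical ACF baseline and a CNN regressing periods from image transforms of the light curve), the sketch below uses a Gramian Angular Field as a stand-in for the unspecified geometry-preserving transform; the function names, hyperparameters, and the particular transform are assumptions rather than details taken from the paper.

```python
import numpy as np
import torch.nn as nn
from torchvision.models import resnet18

def acf_rotation_period(flux, cadence_days):
    """Baseline: take the lag of the first autocorrelation peak as the rotation period."""
    flux = np.asarray(flux, dtype=float) - np.mean(flux)
    acf = np.correlate(flux, flux, mode="full")[flux.size - 1:]
    acf /= acf[0]
    peaks = np.flatnonzero((acf[1:-1] > acf[:-2]) & (acf[1:-1] > acf[2:])) + 1
    return peaks[0] * cadence_days if peaks.size else np.nan

def gramian_angular_field(flux, size=224):
    """Illustrative time-series-to-image transform (Gramian Angular Summation Field)."""
    flux = np.asarray(flux, dtype=float)
    x = flux[np.linspace(0, flux.size - 1, size).astype(int)]   # downsample to a fixed length
    x = 2 * (x - x.min()) / (x.max() - x.min() + 1e-12) - 1     # rescale to [-1, 1]
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    return np.cos(phi[:, None] + phi[None, :])                  # (size, size) image

# ResNet-18 backbone for transfer learning, re-headed for single-output period regression.
model = resnet18(weights="IMAGENET1K_V1")        # older torchvision versions use pretrained=True
model.fc = nn.Linear(model.fc.in_features, 1)    # predict one rotation period per image
# The single-channel image would be repeated across three channels before being fed to the model.
```

The paper quotes light-curve inputs of roughly 1k points; in this sketch the series is simply downsampled to the image side length, which is an illustrative simplification.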
Related papers
- Advancing Machine Learning for Stellar Activity and Exoplanet Period Rotation [0.3926357402982764]
This study applied machine learning models to estimate stellar rotation periods from corrected light curve data obtained by the NASA Kepler mission.
Traditional methods often struggle to estimate rotation periods accurately due to noise and variability in the light curve data.
We employed several machine learning algorithms, including Decision Tree, Random Forest, K-Nearest Neighbors, and Gradient Boosting, and also utilized a Voting Ensemble approach to improve prediction accuracy and robustness.
arXiv Detail & Related papers (2024-09-09T10:25:13Z)
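A hypothetical scikit-learn sketch of the ensemble approach described in the entry above; the feature matrix and period values are random placeholders, since the summary does not specify the actual light-curve features used.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor, VotingRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))           # placeholder per-star light-curve features
y = rng.uniform(1.0, 50.0, size=500)     # placeholder rotation periods in days

ensemble = VotingRegressor([
    ("tree", DecisionTreeRegressor(max_depth=8)),
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ("knn", KNeighborsRegressor(n_neighbors=10)),
    ("gb", GradientBoostingRegressor(random_state=0)),
])
ensemble.fit(X, y)                       # averages the four regressors' predictions
predicted_periods = ensemble.predict(X[:5])
```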
- PARE-Net: Position-Aware Rotation-Equivariant Networks for Robust Point Cloud Registration [8.668461141536383]
Learning rotation-invariant distinctive features is a fundamental requirement for point cloud registration.
Existing methods often use rotation-sensitive networks to extract features, while employing rotation augmentation to crudely learn an approximate invariant mapping.
We propose a novel position-aware rotation-equivariant network for efficient, lightweight, and robust registration.
arXiv Detail & Related papers (2024-07-14T10:26:38Z)
- The Scaling Law in Stellar Light Curves [3.090476527764192]
We investigate the scaling law properties that emerge when learning from astronomical time series data using self-supervised techniques.
A self-supervised Transformer model achieves 3 to 10 times the sample efficiency of the state-of-the-art supervised learning model.
Our research lays the groundwork for analyzing stellar light curves by examining them through large-scale auto-regressive generative models.
arXiv Detail & Related papers (2024-05-27T13:31:03Z)
- CRIN: Rotation-Invariant Point Cloud Analysis and Rotation Estimation via Centrifugal Reference Frame [60.24797081117877]
We propose CRIN, the Centrifugal Rotation-Invariant Network.
CRIN directly takes the coordinates of points as input and transforms local points into rotation-invariant representations.
A continuous distribution for 3D rotations based on points is introduced.
arXiv Detail & Related papers (2023-03-06T13:14:10Z)
- RAGO: Recurrent Graph Optimizer For Multiple Rotation Averaging [62.315673415889314]
This paper proposes a deep recurrent Rotation Averaging Graph Optimizer (RAGO) for Multiple Rotation Averaging (MRA).
Our framework is a real-time, learning-to-optimize rotation averaging graph of tiny size, deployable in real-world applications.
arXiv Detail & Related papers (2022-12-14T13:19:40Z)
- Cosmology from Galaxy Redshift Surveys with PointNet [65.89809800010927]
In cosmology, galaxy redshift surveys resemble a permutation-invariant collection of positions in space.
We employ a PointNet-like neural network to regress the values of the cosmological parameters directly from point cloud data.
Our implementation of PointNets can analyse inputs of $\mathcal{O}(10^4)$ - $\mathcal{O}(10^5)$ galaxies at a time, which improves upon earlier work for this application by roughly two orders of magnitude.
arXiv Detail & Related papers (2022-11-22T15:35:05Z)
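A minimal PyTorch sketch of the permutation-invariant design mentioned in the entry above (a shared per-point MLP followed by a global max-pool and a regression head); the layer widths, the two output parameters, and the mock catalogue are assumptions.

```python
import torch
import torch.nn as nn

class PointNetRegressor(nn.Module):
    """Shared per-point MLP + global max-pool: invariant to the ordering of the galaxies."""
    def __init__(self, n_params=2):
        super().__init__()
        self.point_mlp = nn.Sequential(nn.Linear(3, 64), nn.ReLU(),
                                       nn.Linear(64, 256), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(256, 128), nn.ReLU(),
                                  nn.Linear(128, n_params))   # e.g. Omega_m and sigma_8

    def forward(self, points):                # points: (batch, n_galaxies, 3)
        features = self.point_mlp(points)     # (batch, n_galaxies, 256)
        pooled = features.max(dim=1).values   # symmetric pooling over galaxies
        return self.head(pooled)

model = PointNetRegressor()
mock_survey = torch.rand(4, 10_000, 3)        # 4 mock catalogues of 10^4 galaxy positions
params = model(mock_survey)                   # (4, 2) predicted cosmological parameters
```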
- SPE-Net: Boosting Point Cloud Analysis via Rotation Robustness Enhancement [118.20816888815658]
We propose a novel deep architecture tailored for 3D point cloud applications, named as SPE-Net.
The embedded 'Selective Position Encoding (SPE)' procedure relies on an attention mechanism that can effectively attend to the underlying rotation condition of the input.
We demonstrate the merits of the SPE-Net and the associated hypothesis on four benchmarks, showing evident improvements on both rotated and unrotated test data over SOTA methods.
arXiv Detail & Related papers (2022-11-15T15:59:09Z)
- Supernova Light Curves Approximation based on Neural Network Models [53.180678723280145]
Photometric data-driven classification of supernovae has become challenging with the advent of real-time processing of big data in astronomy.
Recent studies have demonstrated the superior quality of solutions based on various machine learning models.
We study the application of multilayer perceptron (MLP), Bayesian neural network (BNN), and normalizing flows (NF) to approximate observations for a single light curve.
arXiv Detail & Related papers (2022-06-27T13:46:51Z)
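A toy scikit-learn sketch of the MLP variant mentioned in the entry above (the Bayesian neural network and normalizing-flow models are omitted); the synthetic light curve and network size are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic single light curve: irregularly sampled times with a periodic signal plus noise.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 60.0, size=300))
flux = 1.0 + 0.05 * np.sin(2 * np.pi * t / 12.0) + rng.normal(0.0, 0.01, size=t.size)

# Fit an MLP to approximate the light curve, then evaluate it on a dense time grid.
mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
mlp.fit(t.reshape(-1, 1), flux)
dense_t = np.linspace(0.0, 60.0, 1000).reshape(-1, 1)
approx_flux = mlp.predict(dense_t)
```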
- ART-Point: Improving Rotation Robustness of Point Cloud Classifiers via Adversarial Rotation [89.47574181669903]
In this study, we show that the rotation robustness of point cloud classifiers can also be acquired via adversarial training.
Specifically, our proposed framework named ART-Point regards the rotation of the point cloud as an attack.
We propose a fast one-step optimization to efficiently reach the final robust model.
arXiv Detail & Related papers (2022-03-08T07:20:16Z)
- Learning Rotation-Invariant Representations of Point Clouds Using Aligned Edge Convolutional Neural Networks [29.3830445533532]
Point cloud analysis is an area of increasing interest due to the development of 3D sensors that are able to rapidly measure the depth of scenes accurately.
Applying deep learning techniques to perform point cloud analysis is non-trivial due to the inability of these methods to generalize to unseen rotations.
To address this limitation, one usually has to augment the training data, which can lead to extra computation and require larger model complexity.
This paper proposes a new neural network called the Aligned Edge Convolutional Neural Network (AECNN) that learns a feature representation of point clouds relative to Local Reference Frames (LRFs).
arXiv Detail & Related papers (2021-01-02T17:36:00Z)
- Rotated Ring, Radial and Depth Wise Separable Radial Convolutions [13.481518628796692]
In this work, we address trainable rotation-invariant convolutions and the construction of networks built from them.
On the one hand, we show that our approach is rotationally invariant for different models and on different public data sets.
The rotationally adaptive convolution models presented are more computationally intensive than normal convolution models.
arXiv Detail & Related papers (2020-10-02T09:01:51Z)