StellarF: A Lora-Adapter Integrated Large Model Framework for Stellar Flare Forecasting with Historical & Statistical Data
- URL: http://arxiv.org/abs/2507.10986v1
- Date: Tue, 15 Jul 2025 04:59:22 GMT
- Title: StellarF: A Lora-Adapter Integrated Large Model Framework for Stellar Flare Forecasting with Historical & Statistical Data
- Authors: Tianyu Su, Zhiqiang Zou, Ali Luo, Xiao Kong, Qingyu Lu, Min Li,
- Abstract summary: This study introduces StellarF (Stellar Flare Forecasting), a novel large model for stellar flare forecasting. At its core, StellarF integrates a flare statistical information module with a historical flare record module, enabling multi-scale pattern recognition from observational data. The proposed prediction paradigm establishes a novel methodological framework for advancing astrophysical research and cross-disciplinary applications.
- Score: 3.901857423144103
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Stellar flare forecasting, a critical research frontier in astronomy, offers profound insights into stellar activity. However, the field is constrained by both the sparsity of recorded flare events and the absence of domain-specific large-scale predictive models. To address these challenges, this study introduces StellarF (Stellar Flare Forecasting), a novel large model that leverages Low-Rank Adaptation (LoRA) and Adapter techniques for parameter-efficient learning in stellar flare forecasting. At its core, StellarF integrates a flare statistical information module with a historical flare record module, enabling multi-scale pattern recognition from observational data. Extensive experiments on our self-constructed datasets (derived from Kepler and TESS light curves) demonstrate that StellarF achieves state-of-the-art performance compared to existing methods. The proposed prediction paradigm establishes a novel methodological framework for advancing astrophysical research and cross-disciplinary applications.
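The abstract describes two ideas that combine naturally in code: parameter-efficient fine-tuning (LoRA and bottleneck Adapters over a frozen backbone) and the fusion of a flare statistical information module with a historical flare record module. The sketch below illustrates that general pattern in PyTorch. All module names, dimensions, the GRU history encoder, and the concatenation-based fusion head are illustrative assumptions drawn only from the abstract, not StellarF's published implementation.

```python
# Minimal sketch of LoRA + Adapter parameter-efficient fine-tuning with a
# two-module (historical records + flare statistics) input, as outlined above.
# All names and shapes are hypothetical, not StellarF's actual architecture.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank (LoRA) update."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():       # freeze the pretrained weights
            p.requires_grad = False
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)     # start as a zero (identity) update
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


class Adapter(nn.Module):
    """Residual bottleneck adapter inserted after a frozen block."""

    def __init__(self, dim: int, bottleneck: int = 32):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))


class FlareForecaster(nn.Module):
    """Fuses a historical flare-record sequence with per-star flare statistics."""

    def __init__(self, hist_dim=16, stat_dim=8, d_model=64):
        super().__init__()
        # Historical flare record module: encodes the event sequence.
        self.hist_proj = LoRALinear(nn.Linear(hist_dim, d_model))
        self.hist_encoder = nn.GRU(d_model, d_model, batch_first=True)
        # Flare statistical information module: encodes summary statistics.
        self.stat_proj = nn.Linear(stat_dim, d_model)
        self.adapter = Adapter(d_model)
        self.head = nn.Linear(2 * d_model, 1)  # probability of a flare in the next window

    def forward(self, history, stats):
        # history: (batch, time, hist_dim); stats: (batch, stat_dim)
        h, _ = self.hist_encoder(self.hist_proj(history))
        h = self.adapter(h[:, -1])             # representation at the last time step
        s = torch.relu(self.stat_proj(stats))
        return torch.sigmoid(self.head(torch.cat([h, s], dim=-1)))


model = FlareForecaster()
prob = model(torch.randn(4, 32, 16), torch.randn(4, 8))  # (4, 1) flare probabilities
```

In this setup only the LoRA factors, the adapters, and the small task head are trainable, which is the usual way such techniques keep the fine-tuned parameter count small relative to the frozen backbone.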
Related papers
- STAR: A Benchmark for Astronomical Star Fields Super-Resolution [51.79340280382437]
We propose STAR, a large-scale astronomical SR dataset containing 54,738 flux-consistent star field image pairs. We also propose a Flux-Invariant Super Resolution (FISR) model that can accurately infer flux-consistent high-resolution images from input photometry.
arXiv Detail & Related papers (2025-07-22T09:28:28Z) - FLARE: A Framework for Stellar Flare Forecasting using Stellar Physical Properties and Historical Records [24.00351327243306]
We introduce FLARE, the first-of-its-kind model specifically designed for stellar flare forecasting. Our experiments on the publicly available Kepler light curve dataset demonstrate that FLARE achieves superior performance compared to other methods across all evaluation metrics.
arXiv Detail & Related papers (2025-02-25T14:03:15Z) - MambaDS: Near-Surface Meteorological Field Downscaling with Topography Constrained Selective State Space Modeling [68.69647625472464]
Downscaling, a crucial task in meteorological forecasting, enables the reconstruction of high-resolution meteorological states for target regions.
Previous downscaling methods lacked tailored designs for meteorology and encountered structural limitations.
We propose a novel model called MambaDS, which enhances the utilization of multivariable correlations and topography information.
arXiv Detail & Related papers (2024-08-20T13:45:49Z) - SFANet: Spatial-Frequency Attention Network for Weather Forecasting [54.470205739015434]
Weather forecasting plays a critical role in various sectors, driving decision-making and risk management.
Traditional methods often struggle to capture the complex dynamics of meteorological systems.
We propose a novel framework designed to address these challenges and enhance the accuracy of weather prediction.
arXiv Detail & Related papers (2024-05-29T08:00:15Z) - The Scaling Law in Stellar Light Curves [3.090476527764192]
We investigate the scaling law properties that emerge when learning from astronomical time series data using self-supervised techniques.
A self-supervised Transformer model achieves 3-10 times the sample efficiency of the state-of-the-art supervised learning model.
Our research lays the groundwork for analyzing stellar light curves by examining them through large-scale auto-regressive generative models.
arXiv Detail & Related papers (2024-05-27T13:31:03Z) - A Foundation Model for the Earth System [82.73624748093333]
We introduce Aurora, a large-scale foundation model for the Earth system trained on over a million hours of diverse data.
Aurora outperforms operational forecasts for air quality, ocean waves, tropical cyclone tracks, and high-resolution weather forecasting at orders of magnitude smaller computational expense than dedicated existing systems.
arXiv Detail & Related papers (2024-05-20T14:45:18Z) - Observation-Guided Meteorological Field Downscaling at Station Scale: A Benchmark and a New Method [66.80344502790231]
We extend meteorological downscaling to arbitrary scattered station scales and establish a new benchmark and dataset.
Inspired by data assimilation techniques, we integrate observational data into the downscaling process, providing multi-scale observational priors.
Our proposed method outperforms other specially designed baseline models on multiple surface variables.
arXiv Detail & Related papers (2024-01-22T14:02:56Z) - Stellar Spectra Fitting with Amortized Neural Posterior Estimation and nbi [0.0]
We train an ANPE model for the APOGEE survey and demonstrate its efficacy on both mock and real stellar spectra.
We introduce an effective approach to handling the measurement noise properties inherent in spectral data.
We discuss the utility of an ANPE "model zoo," where models are trained for specific instruments and distributed under the nbi framework.
arXiv Detail & Related papers (2023-12-09T21:30:07Z) - Learning Robust Precipitation Forecaster by Temporal Frame Interpolation [65.5045412005064]
We develop a robust precipitation forecasting model that demonstrates resilience against spatial-temporal discrepancies.
Our approach has led to significant improvements in forecasting precision, culminating in our model securing 1st place in the transfer learning leaderboard of the Weather4cast'23 competition.
arXiv Detail & Related papers (2023-11-30T08:22:08Z) - A Novel Application of Conditional Normalizing Flows: Stellar Age Inference with Gyrochronology [0.0]
We show that a data-driven approach can constrain gyrochronological ages with a precision comparable to other standard techniques.
This work demonstrates the potential of a probabilistic data-driven solution to widen the applicability of gyrochronological stellar dating.
arXiv Detail & Related papers (2023-07-17T18:00:19Z) - Astroconformer: Inferring Surface Gravity of Stars from Stellar Light Curves with Transformer [1.122225892380515]
We introduce Astroconformer, a Transformer-based model to analyze stellar light curves from the Kepler mission.
We demonstrate that Astroconformer can robustly infer the stellar surface gravity as a supervised task.
We also show that the method can generalize to sparse cadence light curves from the Rubin Observatory.
arXiv Detail & Related papers (2022-07-06T16:22:37Z) - Deep Learning Models of the Discrete Component of the Galactic Interstellar Gamma-Ray Emission [61.26321023273399]
A significant point-like component from the small scale (or discrete) structure in the H2 interstellar gas might be present in the Fermi-LAT data.
We show that deep learning may be effectively employed to model the gamma-ray emission traced by these rare H2 proxies within statistical significance in data-rich regions.
arXiv Detail & Related papers (2022-06-06T18:00:07Z)