Template-Fitting Meets Deep Learning: Redshift Estimation Using Physics-Guided Neural Networks
- URL: http://arxiv.org/abs/2507.00866v1
- Date: Tue, 01 Jul 2025 15:29:45 GMT
- Title: Template-Fitting Meets Deep Learning: Redshift Estimation Using Physics-Guided Neural Networks
- Authors: Jonas Chris Ferrao, Dickson Dias, Pranav Naik, Glory D'Cruz, Anish Naik, Siya Khandeparkar, Manisha Gokuldas Fal Dessai
- Abstract summary: We present a hybrid method that integrates template fitting with deep learning using physics-guided neural networks. We evaluate our model on the publicly available PREML dataset, which includes approximately 400,000 galaxies. Our approach achieves an RMS error of 0.0507, a 3-sigma catastrophic outlier rate of 0.13%, and a bias of 0.0028.
- Score: 0.4416697929169138
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate photometric redshift estimation is critical for observational cosmology, especially in large-scale surveys where spectroscopic measurements are impractical. Traditional approaches include template fitting and machine learning, each with distinct strengths and limitations. We present a hybrid method that integrates template fitting with deep learning using physics-guided neural networks. By embedding spectral energy distribution templates into the network architecture, our model encodes physical priors into the training process. The system employs a multimodal design, incorporating cross-attention mechanisms to fuse photometric and image data, along with Bayesian layers for uncertainty estimation. We evaluate our model on the publicly available PREML dataset, which includes approximately 400,000 galaxies from the Hyper Suprime-Cam PDR3 release, with 5-band photometry, multi-band imaging, and spectroscopic redshifts. Our approach achieves an RMS error of 0.0507, a 3-sigma catastrophic outlier rate of 0.13%, and a bias of 0.0028. The model satisfies two of the three LSST photometric redshift requirements for redshifts below 3. These results highlight the potential of combining physically motivated templates with data-driven models for robust redshift estimation in upcoming cosmological surveys.
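The abstract names two architectural ingredients: cross-attention that fuses the photometric and imaging branches, and Bayesian layers for uncertainty estimation. The sketch below is not the authors' implementation; it illustrates the two ideas with a single-head cross-attention in NumPy and Monte Carlo dropout as a common stand-in for Bayesian layers. All dimensions, weight initializations, and the dropout rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cross_attention(phot, patches, Wq, Wk, Wv):
    """phot: (d,) photometry embedding; patches: (n, d) image-patch embeddings."""
    q = phot @ Wq                         # query from the photometric branch
    k = patches @ Wk                      # keys from the imaging branch
    v = patches @ Wv                      # values from the imaging branch
    attn = softmax(k @ q / np.sqrt(q.size))
    return attn @ v                       # (d,) fused representation

d, n_patches = 16, 8
Wq, Wk, Wv = [0.1 * rng.normal(size=(d, d)) for _ in range(3)]
w_out = 0.1 * rng.normal(size=d)          # linear redshift head (illustrative)

phot = rng.normal(size=d)                 # embedded 5-band photometry
patches = rng.normal(size=(n_patches, d)) # embedded image-cutout patches

fused = cross_attention(phot, patches, Wq, Wk, Wv)

# Monte Carlo dropout: repeated stochastic passes through the head give a
# mean prediction and a spread, a cheap approximation to Bayesian layers.
samples = [(fused * (rng.random(d) > 0.1)) @ w_out for _ in range(200)]
z_mean, z_std = float(np.mean(samples)), float(np.std(samples))
print(f"z_mean={z_mean:.3f}  z_std={z_std:.3f}")
```

The attention weights let the photometric query decide which image patches contribute to the fused vector, which is one way a network can encode that morphology should modulate, not replace, the photometric signal.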
Related papers
- FreqCross: A Multi-Modal Frequency-Spatial Fusion Network for Robust Detection of Stable Diffusion 3.5 Generated Images [4.524282351757178]
FreqCross is a novel multi-modal fusion network that combines spatial RGB features, frequency-domain artifacts, and radial energy distribution patterns. Experiments on a dataset of 10,000 paired real (MS-COCO) and synthetic (Stable Diffusion 3.5) images demonstrate that FreqCross achieves 97.8% accuracy.
arXiv Detail & Related papers (2025-07-01T22:12:35Z)
- Mantis Shrimp: Exploring Photometric Band Utilization in Computer Vision Networks for Photometric Redshift Estimation [0.30924355683504173]
We present a model for photometric redshift estimation that fuses ultraviolet (GALEX), optical (PanSTARRS), and infrared (UnWISE) imagery. Mantis Shrimp estimates the conditional density of redshift using cutout images. We study how the models learn to use information across bands, finding evidence that they successfully incorporate information from all surveys.
arXiv Detail & Related papers (2025-01-15T19:46:23Z)
- SpectralGPT: Spectral Remote Sensing Foundation Model [60.023956954916414]
A universal RS foundation model, named SpectralGPT, is purpose-built to handle spectral RS images using a novel 3D generative pretrained transformer (GPT).
Compared to existing foundation models, SpectralGPT accommodates input images with varying sizes, resolutions, time series, and regions in a progressive training fashion, enabling full utilization of extensive RS big data.
Our evaluation highlights significant performance improvements with pretrained SpectralGPT models, signifying substantial potential in advancing spectral RS big data applications within the field of geoscience.
arXiv Detail & Related papers (2023-11-13T07:09:30Z)
- Photo-zSNthesis: Converting Type Ia Supernova Lightcurves to Redshift Estimates via Deep Learning [0.0]
Photo-zSNthesis is a convolutional neural network-based method for predicting full redshift probability distributions.
We show a 61x improvement in prediction bias &lt;Delta z&gt; on PLAsTiCC simulations and a 5x improvement on real SDSS data.
arXiv Detail & Related papers (2023-05-19T17:59:00Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Multi-View Photometric Stereo Revisited [100.97116470055273]
Multi-view photometric stereo (MVPS) is a preferred method for detailed and precise 3D acquisition of an object from images.
We present a simple, practical approach to MVPS that works well for isotropic as well as other object material types, such as anisotropic and glossy surfaces.
The proposed approach shows state-of-the-art results when tested extensively on several benchmark datasets.
arXiv Detail & Related papers (2022-10-14T09:46:15Z)
- Neural network enhanced measurement efficiency for molecular groundstates [63.36515347329037]
We adapt common neural network models to learn complex groundstate wavefunctions for several molecular qubit Hamiltonians.
We find that using a neural network model provides a robust improvement over using single-copy measurement outcomes alone to reconstruct observables.
arXiv Detail & Related papers (2022-06-30T17:45:05Z)
- Photometric Redshift Estimation with Convolutional Neural Networks and Galaxy Images: A Case Study of Resolving Biases in Data-Driven Methods [0.0]
We investigate two major forms of biases, i.e., class-dependent residuals and mode collapse, in a case study of estimating photometric redshifts.
We propose a set of consecutive steps for resolving the two biases based on CNN models.
Experiments show that our methods are better at controlling biases than benchmark methods.
arXiv Detail & Related papers (2022-02-21T02:59:33Z)
- Physical model simulator-trained neural network for computational 3D phase imaging of multiple-scattering samples [1.112751058850223]
We develop a new model-based data normalization pre-processing procedure for homogenizing the sample contrast.
We demonstrate this framework's capabilities on experimental measurements of epithelial buccal cells and Caenorhabditis elegans worms.
arXiv Detail & Related papers (2021-03-29T17:43:56Z)
- Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.