Category-based Galaxy Image Generation via Diffusion Models
- URL: http://arxiv.org/abs/2506.16255v1
- Date: Thu, 19 Jun 2025 12:14:33 GMT
- Title: Category-based Galaxy Image Generation via Diffusion Models
- Authors: Xingzhong Fan, Hongming Tang, Yue Zeng, M. B. N. Kouwenhoven, Guangquan Zeng,
- Abstract summary: We present GalCatDiff, the first framework in astronomy to leverage both galaxy image features and astrophysical properties in the network design of diffusion models. GalCatDiff incorporates an enhanced U-Net and a novel block named Astro-RAB (Residual Attention Block), which dynamically combines attention mechanisms with convolution operations to ensure global consistency and local feature fidelity. Our experimental results demonstrate that GalCatDiff significantly outperforms existing methods in terms of the consistency of sample color and size distributions, and the generated galaxies are both visually realistic and physically consistent.
- Score: 0.39945675027960637
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conventional galaxy generation methods rely on semi-analytical models and hydrodynamic simulations, which are highly dependent on physical assumptions and parameter tuning. In contrast, data-driven generative models do not require explicit physical parameters to be pre-determined; instead, they learn them efficiently from observational data, making them an alternative approach to galaxy generation. Among these, diffusion models outperform Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs) in quality and diversity. Incorporating physical prior knowledge into these models can further enhance their capabilities. In this work, we present GalCatDiff, the first framework in astronomy to leverage both galaxy image features and astrophysical properties in the network design of diffusion models. GalCatDiff incorporates an enhanced U-Net and a novel block named Astro-RAB (Residual Attention Block), which dynamically combines attention mechanisms with convolution operations to ensure global consistency and local feature fidelity. Moreover, GalCatDiff uses category embeddings for class-specific galaxy generation, avoiding the high computational costs of training separate models for each category. Our experimental results demonstrate that GalCatDiff significantly outperforms existing methods in terms of the consistency of sample color and size distributions, and the generated galaxies are both visually realistic and physically consistent. This framework will enhance the reliability of galaxy simulations and can potentially serve as a data augmentor to support future galaxy classification algorithm development.
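Only the abstract is available here, so the following is a minimal, hypothetical PyTorch sketch of the two ideas it names: a residual block that mixes convolution (local feature fidelity) with self-attention (global consistency), in the spirit of Astro-RAB, and a category embedding added to the diffusion timestep embedding for class-conditional generation. All module names, layer choices, and dimensions are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of an Astro-RAB-style residual attention block and of
# category conditioning for a class-conditional diffusion U-Net.
# Illustrative assumptions only -- not the authors' code.
import torch
import torch.nn as nn

class ResidualAttentionBlock(nn.Module):
    """Residual block mixing convolution (local features) with self-attention
    (global consistency), balanced by a learned gate."""
    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.GroupNorm(8, channels), nn.SiLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.norm = nn.GroupNorm(8, channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.gate = nn.Parameter(torch.zeros(1))  # learned conv/attention balance

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local = self.conv(x)                                    # convolutional branch
        tokens = self.norm(x).flatten(2).transpose(1, 2)        # (B, H*W, C)
        attended, _ = self.attn(tokens, tokens, tokens)         # self-attention branch
        attended = attended.transpose(1, 2).reshape(b, c, h, w)
        return x + local + torch.sigmoid(self.gate) * attended  # residual combination

class CategoryConditioning(nn.Module):
    """Adds a learned class embedding to the diffusion timestep embedding,
    so a single model can generate every galaxy category."""
    def __init__(self, num_classes: int, emb_dim: int):
        super().__init__()
        self.class_emb = nn.Embedding(num_classes, emb_dim)

    def forward(self, t_emb: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        return t_emb + self.class_emb(labels)

# Smoke test with dummy tensors
block = ResidualAttentionBlock(channels=64)
cond = CategoryConditioning(num_classes=5, emb_dim=128)
x, t_emb, labels = torch.randn(2, 64, 32, 32), torch.randn(2, 128), torch.tensor([0, 3])
print(block(x).shape, cond(t_emb, labels).shape)
```

The scalar gate lets the network learn how much global attention to mix into the convolutional residual; whether GalCatDiff balances the two branches this way is not stated in the abstract.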
Related papers
- A Generative Model for Disentangling Galaxy Photometric Parameters [1.8227840589648028]
We propose a Conditional AutoEncoder (CAE) framework to simultaneously model and characterize galaxy morphology.
Our CAE is trained on a suite of realistic mock galaxy images generated via GalSim, encompassing a broad range of galaxy types, photometric parameters, and observational conditions.
By encoding each galaxy image into a low-dimensional latent representation conditioned on key parameters, our model effectively recovers these morphological features in a disentangled manner, while also reconstructing the original image.
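As a rough, hypothetical illustration of that conditioning idea (not the paper's architecture), the sketch below concatenates the photometric parameters to the latent code before decoding; the layer sizes, parameter count, and image size are all assumptions.

```python
# Minimal conditional-autoencoder sketch; shapes and layers are illustrative assumptions.
import torch
import torch.nn as nn

class ConditionalAutoEncoder(nn.Module):
    def __init__(self, img_dim: int = 64 * 64, latent_dim: int = 16, n_params: int = 5):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(img_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))
        # The decoder sees the latent code together with the conditioning parameters.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + n_params, 256), nn.ReLU(), nn.Linear(256, img_dim))

    def forward(self, image: torch.Tensor, params: torch.Tensor) -> torch.Tensor:
        z = self.encoder(image.flatten(1))                  # low-dimensional latent code
        recon = self.decoder(torch.cat([z, params], dim=1))
        return recon.view_as(image)

cae = ConditionalAutoEncoder()
imgs = torch.randn(4, 64, 64)    # mock galaxy cutouts
params = torch.randn(4, 5)       # e.g. flux, radius, Sersic index, axis ratio, position angle
loss = nn.functional.mse_loss(cae(imgs, params), imgs)
```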
arXiv Detail & Related papers (2025-07-21T03:09:37Z) - Can AI Dream of Unseen Galaxies? Conditional Diffusion Model for Galaxy Morphology Augmentation [4.3933321767775135]
We propose a conditional diffusion model to synthesize realistic galaxy images for augmenting machine learning data.
We show that our model generates diverse, high-fidelity galaxy images that closely adhere to the specified morphological feature conditions.
This model enables generative extrapolation to project well-annotated data into unseen domains, advancing rare object detection.
arXiv Detail & Related papers (2025-06-19T11:44:09Z) - UniGenX: Unified Generation of Sequence and Structure with Autoregressive Diffusion [61.690978792873196]
Existing approaches rely on either autoregressive sequence models or diffusion models.
We propose UniGenX, a unified framework that combines autoregressive next-token prediction with conditional diffusion models.
We validate the effectiveness of UniGenX on material and small molecule generation tasks.
arXiv Detail & Related papers (2025-03-09T16:43:07Z) - Conditional Diffusion-Flow models for generating 3D cosmic density fields: applications to f(R) cosmologies [0.0]
Next-generation galaxy surveys promise unprecedented precision in testing gravity at cosmological scales.
We explore conditional generative modelling to create 3D dark matter density fields via score-based (diffusion) and flow-based methods.
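A generic, hypothetical sketch of the score-based side of such a model is a denoising score-matching loss for a conditional score network; the network, the conditioning variable (here a single assumed cosmological parameter), and the field size below are placeholders, not the paper's setup.

```python
# Denoising score-matching loss for a conditional score model (illustrative only).
import torch

def dsm_loss(score_net, x0, cond, sigma=0.5):
    """score_net(x_noisy, cond) should approximate the score of the perturbed density."""
    noise = torch.randn_like(x0)
    x_noisy = x0 + sigma * noise
    target = -noise / sigma          # score of N(x0, sigma^2 I) evaluated at x_noisy
    return ((score_net(x_noisy, cond) - target) ** 2).mean()

# Trivial linear "score network" over flattened 3D density cubes, for a smoke test.
net = torch.nn.Linear(16 ** 3 + 1, 16 ** 3)
score_net = lambda x, c: net(torch.cat([x.flatten(1), c], dim=1)).view_as(x)
fields = torch.randn(2, 16, 16, 16)   # mock overdensity cubes
cosmo = torch.rand(2, 1)              # assumed scalar condition, e.g. an f(R) parameter
print(dsm_loss(score_net, fields, cosmo))
```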
arXiv Detail & Related papers (2025-02-24T12:06:23Z) - Understanding Reinforcement Learning-Based Fine-Tuning of Diffusion Models: A Tutorial and Review [63.31328039424469]
This tutorial provides a comprehensive survey of methods for fine-tuning diffusion models to optimize downstream reward functions.
We explain the application of various RL algorithms, including PPO, differentiable optimization, reward-weighted MLE, value-weighted sampling, and path consistency learning.
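To make one of the listed families concrete, here is a toy reward-weighted denoising loss: the per-sample denoising error is re-weighted by a function of each sample's reward. The softmax weighting, reward shape, and tensor sizes are assumptions for illustration and are not taken from the tutorial.

```python
# Toy reward-weighted denoising loss (illustrative; not the tutorial's reference code).
import torch

def reward_weighted_loss(eps_pred, eps_true, rewards, temperature=1.0):
    """Per-sample MSE denoising loss, up-weighted for higher-reward samples."""
    per_sample = ((eps_pred - eps_true) ** 2).flatten(1).mean(dim=1)
    weights = torch.softmax(rewards / temperature, dim=0) * rewards.numel()
    return (weights * per_sample).mean()

eps_pred, eps_true = torch.randn(8, 3, 16, 16), torch.randn(8, 3, 16, 16)
rewards = torch.rand(8)   # e.g. an aesthetic or physics-consistency score per sample
print(reward_weighted_loss(eps_pred, eps_true, rewards))
```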
arXiv Detail & Related papers (2024-07-18T17:35:32Z) - Diffusion-Based Neural Network Weights Generation [80.89706112736353]
D2NWG is a diffusion-based neural network weights generation technique that efficiently produces high-performing weights for transfer learning.
Our method extends generative hyper-representation learning to recast the latent diffusion paradigm for neural network weights generation.
Our approach is scalable to large architectures such as large language models (LLMs), overcoming the limitations of current parameter generation techniques.
arXiv Detail & Related papers (2024-02-28T08:34:23Z) - LtU-ILI: An All-in-One Framework for Implicit Inference in Astrophysics and Cosmology [1.5070941464775514]
This paper presents the Learning the Universe Implicit Likelihood Inference (LtU-ILI) pipeline.
It provides cutting-edge machine learning (ML) inference for astrophysics and cosmology.
We present real applications across a range of astrophysics and cosmology problems.
arXiv Detail & Related papers (2024-02-06T19:00:00Z) - Discovering Galaxy Features via Dataset Distillation [7.121183597915665]
In many applications, Neural Nets (NNs) have classification performance on par with, or even exceeding, human capacity.
Here, we apply this idea to the notoriously difficult task of galaxy classification.
We present a novel way to summarize and visualize prototypical galaxy morphology through the lens of neural networks.
arXiv Detail & Related papers (2023-11-29T12:39:31Z) - A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z) - Supernova Light Curves Approximation based on Neural Network Models [53.180678723280145]
Photometric data-driven classification of supernovae has become a challenge with the advent of real-time processing of big data in astronomy.
Recent studies have demonstrated the superior quality of solutions based on various machine learning models.
We study the application of multilayer perceptron (MLP), Bayesian neural network (BNN), and normalizing flows (NF) to approximate observations for a single light curve.
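As a minimal, hypothetical sketch of the simplest of the three approaches, the MLP below regresses flux from observation time and a one-hot passband encoding; the number of bands, layer sizes, and mock data are assumptions, not the paper's configuration.

```python
# Minimal MLP light-curve approximator (illustrative assumptions, not the paper's setup).
import torch
import torch.nn as nn

class LightCurveMLP(nn.Module):
    """Maps (time, one-hot passband) -> flux for a single supernova light curve."""
    def __init__(self, n_bands: int = 6, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + n_bands, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, t: torch.Tensor, band: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([t, band], dim=1)).squeeze(1)

model = LightCurveMLP()
t = torch.rand(100, 1)                                           # normalized observation times
band = nn.functional.one_hot(torch.randint(0, 6, (100,)), 6).float()
flux = torch.sin(6.28 * t.squeeze(1))                            # mock observations
loss = nn.functional.mse_loss(model(t, band), flux)
```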
arXiv Detail & Related papers (2022-06-27T13:46:51Z) - Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels [67.81799703916563]
We introduce new techniques to formulate the problem as solving the Fokker-Planck equation in a lower-dimensional latent space.
Our proposed model consists of latent-distribution morphing, a generator, and a parameterized Fokker-Planck kernel function.
arXiv Detail & Related papers (2021-05-10T17:42:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.