Implicit Augmentation from Distributional Symmetry in Turbulence Super-Resolution
- URL: http://arxiv.org/abs/2509.20683v1
- Date: Thu, 25 Sep 2025 02:31:56 GMT
- Title: Implicit Augmentation from Distributional Symmetry in Turbulence Super-Resolution
- Authors: Julia Balla, Jeremiah Bailey, Ali Backour, Elyssa Hofgard, Tommi Jaakkola, Tess Smidt, Ryley McConkey
- Abstract summary: We show that standard convolutional neural networks (CNNs) can partially acquire rotational symmetry without explicit augmentation or specialized architectures. Using 3D channel-flow data with differing anisotropy, we find that models trained on more isotropic mid-plane data achieve lower equivariance error. These results clarify when rotational symmetry must be explicitly incorporated into learning algorithms and when it can be obtained directly from turbulence.
- Score: 2.4095472682599417
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The immense computational cost of simulating turbulence has motivated the use of machine learning approaches for super-resolving turbulent flows. A central challenge is ensuring that learned models respect physical symmetries, such as rotational equivariance. We show that standard convolutional neural networks (CNNs) can partially acquire this symmetry without explicit augmentation or specialized architectures, as turbulence itself provides implicit rotational augmentation in both time and space. Using 3D channel-flow subdomains with differing anisotropy, we find that models trained on more isotropic mid-plane data achieve lower equivariance error than those trained on boundary layer data, and that greater temporal or spatial sampling further reduces this error. We also observe a distinct scale-dependence of the equivariance error that occurs regardless of dataset anisotropy and is consistent with Kolmogorov's local isotropy hypothesis. These results clarify when rotational symmetry must be explicitly incorporated into learning algorithms and when it can be obtained directly from turbulence, enabling more efficient and symmetry-aware super-resolution.
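The equivariance error discussed in the abstract measures how far a model is from commuting with rotations, i.e. how much f(Rx) differs from R f(x). The sketch below illustrates the idea for 90-degree rotations of a 2D field with a single convolution kernel; it is a minimal illustration of the metric, not the paper's actual 3D implementation, and the function names and kernel choices are assumptions for demonstration.

```python
import numpy as np

def conv2d(x, k):
    """'Valid'-mode 2-D cross-correlation of field x with kernel k."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def equivariance_error(kernel, x):
    """Relative norm of conv(rot90(x)) - rot90(conv(x)).

    Zero iff the map commutes with 90-degree rotations on this input.
    """
    f_of_rotated = conv2d(np.rot90(x), kernel)
    rotated_f = np.rot90(conv2d(x, kernel))
    return np.linalg.norm(f_of_rotated - rotated_f) / np.linalg.norm(rotated_f)

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32))       # stand-in for a velocity-field slice

k_rand = rng.standard_normal((3, 3))    # generic learned-like kernel
k_iso = np.ones((3, 3)) / 9.0           # kernel invariant under 90-degree rotation

print(equivariance_error(k_rand, x))    # nonzero: generic kernels break the symmetry
print(equivariance_error(k_iso, x))     # ~0: rotation-invariant kernel is equivariant
```

A CNN trained on sufficiently isotropic data can drive this error down without any architectural constraint, which is the "implicit augmentation" effect the paper quantifies.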
Related papers
- Rethinking Diffusion Models with Symmetries through Canonicalization with Applications to Molecular Graph Generation [56.361076943802594]
CanonFlow achieves state-of-the-art performance on the challenging GEOM-DRUG dataset, and the advantage remains large in few-step generation.
arXiv Detail & Related papers (2026-02-16T18:58:55Z) - Harnessing Equivariance: Modeling Turbulence with Graph Neural Networks [0.0]
This work proposes a novel methodology for turbulence modeling in Large Eddy Simulation (LES) based on Graph Neural Networks (GNNs). GNNs embed the discrete rotational, reflectional, and translational symmetries of the Navier-Stokes equations into the model architecture. The suitability of the proposed approach is investigated for two canonical test cases: Homogeneous Isotropic Turbulence (HIT) and turbulent channel flow.
arXiv Detail & Related papers (2025-04-10T13:37:54Z) - How does ion temperature gradient turbulence depend on magnetic geometry? Insights from data and machine learning [3.5086936665880017]
Magnetic geometry has a significant effect on the level of turbulent transport in fusion plasmas. We analyze this dependence using multiple machine learning methods and a dataset of > 200,000 nonlinear simulations. Multiple regression models including convolutional neural networks (CNNs) and decision trees can achieve reasonable predictive power for the heat flux.
arXiv Detail & Related papers (2025-02-17T10:48:26Z) - Rao-Blackwell Gradient Estimators for Equivariant Denoising Diffusion [41.50816120270017]
In domains such as molecular and protein generation, physical systems exhibit inherent symmetries that are critical to model. We present a framework that reduces training variance and provides a provably lower-variance gradient estimator. We also present a practical implementation of this estimator incorporating the loss and sampling procedure through a method we call Orbit Diffusion.
arXiv Detail & Related papers (2025-02-14T03:26:57Z) - Stochastic Reconstruction of Gappy Lagrangian Turbulent Signals by Conditional Diffusion Models [1.7810134788247751]
We present a method for reconstructing missing spatial and velocity data along the trajectories of small objects passively advected by turbulent flows.
Our approach makes use of conditional generative diffusion models, a recently proposed data-driven machine learning technique.
arXiv Detail & Related papers (2024-10-31T14:26:10Z) - Learning Layer-wise Equivariances Automatically using Gradients [66.81218780702125]
Convolutions encode equivariance symmetries into neural networks leading to better generalisation performance.
However, symmetries provide fixed, hard constraints on the functions a network can represent; they need to be specified in advance and cannot be adapted.
Our goal is to allow flexible symmetry constraints that can automatically be learned from data using gradients.
arXiv Detail & Related papers (2023-10-09T20:22:43Z) - On Error Propagation of Diffusion Models [77.91480554418048]
We develop a theoretical framework to mathematically formulate error propagation in the architecture of DMs.
We apply the cumulative error as a regularization term to reduce error propagation.
Our proposed regularization reduces error propagation, significantly improves vanilla DMs, and outperforms previous baselines.
arXiv Detail & Related papers (2023-08-09T15:31:17Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Learning Discretized Neural Networks under Ricci Flow [48.47315844022283]
We study Discretized Neural Networks (DNNs) composed of low-precision weights and activations. DNNs suffer from either infinite or zero gradients due to the non-differentiable discrete function during training.
arXiv Detail & Related papers (2023-02-07T10:51:53Z) - Machine learning algorithms for three-dimensional mean-curvature computation in the level-set method [0.0]
We propose a data-driven mean-curvature solver for the level-set method.
Our proposed system can yield more accurate mean-curvature estimations than modern particle-based interface reconstruction.
arXiv Detail & Related papers (2022-08-18T20:19:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.