Deep Micro Solvers for Rough-Wall Stokes Flow in a Heterogeneous Multiscale Method
- URL: http://arxiv.org/abs/2507.13902v1
- Date: Fri, 18 Jul 2025 13:29:24 GMT
- Title: Deep Micro Solvers for Rough-Wall Stokes Flow in a Heterogeneous Multiscale Method
- Authors: Emanuel Ström, Anna-Karin Tornberg, Ozan Öktem
- Abstract summary: We propose a learned precomputation for rough-wall Stokes flow. A network is designed to map from the local wall geometry to the Riesz representors for the corresponding local flow averages. The accuracy of the HMM solution for the macroscopic flow is comparable to when the local (micro) problems are solved using a classical approach.
- Score: 1.747820331822631
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a learned precomputation for the heterogeneous multiscale method (HMM) for rough-wall Stokes flow. A Fourier neural operator is used to approximate local averages over microscopic subsets of the flow, which makes it possible to compute an effective slip length of the fluid away from the roughness. The network is designed to map from the local wall geometry to the Riesz representors for the corresponding local flow averages. With such a parameterisation, the network depends only on the local wall geometry and can therefore be trained independently of boundary conditions. We perform a detailed theoretical analysis of the statistical error propagation, and prove that under suitable regularity and scaling assumptions, a bounded training loss leads to a bounded error in the resulting macroscopic flow. We then demonstrate on a family of test problems that the learned precomputation performs stably with respect to the scale of the roughness. The accuracy of the HMM solution for the macroscopic flow is comparable to that obtained when the local (micro) problems are solved using a classical approach, while the computational cost of solving the micro problems is significantly reduced.
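The abstract describes a network that maps the local wall geometry to Riesz representors, so that each local flow average becomes an inner product between a representor and the velocity field. A minimal sketch of that parameterisation is given below, with a single untrained spectral layer standing in for the Fourier neural operator; all names, weights, and profiles here are illustrative placeholders, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_layer(h, weights):
    """Toy spectral layer: filter the wall profile in Fourier space.
    In a real FNO these weights are trained; here they are random."""
    H = np.fft.rfft(h)
    return np.fft.irfft(H * weights, n=h.size)

def micro_representor(h, weights):
    # Map local wall geometry h -> Riesz representor r, so that a
    # local flow average can be written as an inner product <r, u>.
    return fourier_layer(h, weights)

n = 64
x = np.linspace(0.0, 1.0, n, endpoint=False)
h = 0.05 * np.sin(2 * np.pi * 3 * x)          # rough-wall height profile
u = 1.0 - np.cos(2 * np.pi * x)               # stand-in velocity samples
weights = rng.standard_normal(n // 2 + 1) + 1j * rng.standard_normal(n // 2 + 1)

r = micro_representor(h, weights)
flow_average = np.dot(r, u) / n               # <r, u> approximated on the grid
print(flow_average)
```

Because the representor depends only on the geometry `h`, it could in principle be precomputed once and reused for any boundary conditions, which is the point of the parameterisation.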
Related papers
- Weighted quantization using MMD: From mean field to mean shift via gradient flows [5.216151302783165]
We show that a Wasserstein-Fisher-Rao gradient flow is well-suited for designing quantizations optimal under MMD. We derive a new fixed-point algorithm called mean shift interacting particles (MSIP). Our unification of gradient flows, mean shift, and MMD-optimal quantization yields algorithms more robust than state-of-the-art methods.
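For context on the fixed-point view, the classic mean-shift update, which MSIP generalises by adding particle interactions, can be sketched as follows. This is plain mean shift used as a crude quantizer, not the paper's algorithm:

```python
import numpy as np

def mean_shift_step(points, data, bandwidth):
    """One fixed-point update: move each quantization point to the
    Gaussian-kernel-weighted mean of the data (classic mean shift)."""
    # pairwise squared distances, shape (n_points, n_data)
    d2 = ((points[:, None, :] - data[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return (w @ data) / w.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.3, (100, 1)),
                       rng.normal(+2, 0.3, (100, 1))])
points = rng.uniform(-3, 3, (4, 1))
for _ in range(50):
    points = mean_shift_step(points, data, bandwidth=0.5)
print(np.sort(points.ravel()))   # points collapse onto the two cluster modes
```

Without an interaction term the points collapse onto modes rather than spreading to quantize the distribution, which is the gap the MSIP-style repulsion is meant to address.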
arXiv Detail & Related papers (2025-02-14T23:13:20Z) - Flow-based Distributionally Robust Optimization [23.232731771848883]
We present a framework, called FlowDRO, for solving flow-based distributionally robust optimization (DRO) problems with Wasserstein uncertainty sets.
We aim to find the continuous worst-case distribution (also called the Least Favorable Distribution, LFD) and to sample from it.
We demonstrate its usage in adversarial learning, distributionally robust hypothesis testing, and a new mechanism for data-driven distribution perturbation differential privacy.
arXiv Detail & Related papers (2023-10-30T03:53:31Z) - Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution of the particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z) - A Neural Network Approach for Homogenization of Multiscale Problems [1.6244541005112747]
We propose a neural network-based approach to the homogenization of multiscale problems.
The proposed method incorporates Brownian walkers to find the macroscopic description of a multiscale PDE solution.
We validate the efficiency and robustness of the proposed method through a suite of linear and nonlinear multiscale problems.
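To illustrate the general idea of Brownian walkers recovering a macroscopic PDE solution, here is a plain Feynman-Kac Monte Carlo sketch for the heat equation. The paper's homogenization method is more involved; this only shows the walker-averaging principle:

```python
import numpy as np

rng = np.random.default_rng(3)

def heat_mc(g, x, t, n_walkers=200_000):
    """Feynman-Kac: u(x, t) = E[g(x + sqrt(2 t) Z)] with Z ~ N(0, 1)
    solves the heat equation u_t = u_xx with initial data u(., 0) = g."""
    z = rng.standard_normal(n_walkers)
    return g(x + np.sqrt(2.0 * t) * z).mean()

# For g(x) = cos(x) the exact solution is u(x, t) = exp(-t) cos(x).
u = heat_mc(np.cos, x=0.0, t=0.5)
print(u, np.exp(-0.5))   # Monte Carlo estimate vs. exact value
```

Averaging over walker endpoints replaces solving the PDE on a grid, which is also the mechanism by which walker statistics can feed a macroscopic (homogenized) description.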
arXiv Detail & Related papers (2022-06-04T17:50:00Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Variational encoder geostatistical analysis (VEGAS) with an application to large scale riverine bathymetry [1.2093180801186911]
Estimation of riverbed profiles, also known as bathymetry, plays a vital role in many applications.
We propose a reduced-order model (ROM) based approach that utilizes a variational autoencoder (VAE), a type of deep neural network with a narrow layer in the middle.
We have tested our inversion approach on a one-mile reach of the Savannah River, GA, USA.
arXiv Detail & Related papers (2021-11-23T08:27:48Z) - Physics-Informed Machine Learning Method for Large-Scale Data Assimilation Problems [48.7576911714538]
We extend the physics-informed conditional Karhunen-Loève expansion (PICKLE) method for modeling subsurface flow with unknown flux (Neumann) and varying head (Dirichlet) boundary conditions.
We demonstrate that the PICKLE method is comparable in accuracy with the standard maximum a posteriori (MAP) method, but is significantly faster than MAP for large-scale problems.
arXiv Detail & Related papers (2021-07-30T18:43:14Z) - Local AdaGrad-Type Algorithm for Stochastic Convex-Concave Minimax Problems [80.46370778277186]
Large scale convex-concave minimax problems arise in numerous applications, including game theory, robust training, and training of generative adversarial networks.
We develop a communication-efficient distributed extragradient algorithm, LocalAdaSient, with an adaptive learning rate suitable for solving convex-concave minimax problems in the Parameter-Server model.
We demonstrate its efficacy through several experiments in both the homogeneous and heterogeneous settings.
arXiv Detail & Related papers (2021-06-18T09:42:05Z) - Machine Learning and Variational Algorithms for Lattice Field Theory [1.198562319289569]
In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
arXiv Detail & Related papers (2021-06-03T16:37:05Z) - Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
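In one dimension the Wasserstein-2 barycenter has a closed form: its quantile function is the weighted average of the inputs' quantile functions, so no minimax optimization is needed. A small sketch of that special case (not the paper's scalable algorithm, which targets general continuous measures):

```python
import numpy as np

def w2_barycenter_1d(samples_list, weights):
    """1D Wasserstein-2 barycenter of equal-size empirical measures:
    the weighted average of the sorted samples (i.e. of the empirical
    quantile functions)."""
    sorted_samples = np.stack([np.sort(s) for s in samples_list])
    return np.average(sorted_samples, axis=0, weights=weights)

rng = np.random.default_rng(2)
a = rng.normal(-1.0, 1.0, 1000)   # samples from N(-1, 1)
b = rng.normal(+3.0, 1.0, 1000)   # samples from N(+3, 1)
bary = w2_barycenter_1d([a, b], weights=[0.5, 0.5])
print(bary.mean())   # close to the midpoint 1.0
```

For two Gaussians with equal variance the barycenter is again Gaussian with the averaged mean, which makes this a convenient sanity check.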
arXiv Detail & Related papers (2021-02-02T21:01:13Z) - Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from the base density to the output space is conditioned on an input x, in order to model conditional densities p(y|x).
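A minimal instance of this idea is a single input-conditioned affine transform, where the change-of-variables formula yields the conditional log-density directly. The conditioners below are hypothetical stand-ins for the neural networks a CNF would learn:

```python
import numpy as np

def log_prob(y, x, mu_fn, log_sigma_fn):
    """Conditional affine flow: y = mu(x) + sigma(x) * z with z ~ N(0, 1).
    Change of variables: log p(y|x) = log N(z; 0, 1) - log sigma(x)."""
    log_sigma = log_sigma_fn(x)
    z = (y - mu_fn(x)) / np.exp(log_sigma)
    base = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)   # standard normal log-density
    return base - log_sigma

# Hypothetical conditioners: in a CNF these would be neural networks of x.
mu_fn = lambda x: 2.0 * x
log_sigma_fn = lambda x: 0.1 * x

lp = log_prob(y=1.0, x=0.5, mu_fn=mu_fn, log_sigma_fn=log_sigma_fn)
print(lp)
```

Sampling is the inverse direction: draw z from the standard normal and return `mu_fn(x) + np.exp(log_sigma_fn(x)) * z`, which is why such flows are efficient in both sampling and inference.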
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.