Data-driven learning of robust nonlocal physics from high-fidelity
synthetic data
- URL: http://arxiv.org/abs/2005.10076v1
- Date: Sun, 17 May 2020 22:53:14 GMT
- Title: Data-driven learning of robust nonlocal physics from high-fidelity
synthetic data
- Authors: Huaiqian You, Yue Yu, Nathaniel Trask, Mamikon Gulian, Marta D'Elia
- Abstract summary: A key challenge to nonlocal models is the analytical complexity of deriving them from first principles, and frequently their use is justified a posteriori.
In this work we extract nonlocal models from data, circumventing these challenges and providing data-driven justification for the resulting model form.
- Score: 3.9181541460605116
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A key challenge to nonlocal models is the analytical complexity of deriving
them from first principles, and frequently their use is justified a posteriori.
In this work we extract nonlocal models from data, circumventing these
challenges and providing data-driven justification for the resulting model
form. Extracting provably robust data-driven surrogates is a major challenge
for machine learning (ML) approaches, due to nonlinearities and lack of
convexity. Our scheme allows extraction of provably invertible nonlocal models
whose kernels may be partially negative. To achieve this, based on established
nonlocal theory, we embed in our algorithm sufficient conditions on the
non-positive part of the kernel that guarantee well-posedness of the learnt
operator. These conditions are imposed as inequality constraints and ensure
that models are robust, even in small-data regimes. We demonstrate this
workflow for a range of applications, including reproduction of manufactured
nonlocal kernels; numerical homogenization of Darcy flow associated with a
heterogeneous periodic microstructure; nonlocal approximation to high-order
local transport phenomena; and approximation of globally supported fractional
diffusion operators by truncated kernels.
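The core idea of the abstract — assembling a linear system from (input, output) field pairs and fitting a discretized nonlocal kernel to it — can be illustrated with a minimal sketch. This reproduces only the "manufactured kernel" recovery setting on a 1D periodic grid with a hypothetical 3-entry stencil kernel; the paper's additional inequality constraints on the non-positive part of the kernel (which guarantee well-posedness) are not implemented here, so this is an unconstrained least-squares core, not the authors' full scheme.

```python
import numpy as np

n, radius = 64, 3                       # grid points, interaction radius in cells
h = 1.0 / n
rng = np.random.default_rng(0)

# Manufactured kernel: K at distances 1h, 2h, 3h (hypothetical values)
true_kernel = np.array([0.5, 1.0, 2.0])

def apply_nonlocal(u, kernel):
    """L_K u(x_i) = sum_d K(d) * (u(x_{i+d}) + u(x_{i-d}) - 2 u(x_i)) * h, periodic grid."""
    out = np.zeros_like(u)
    for d, k in enumerate(kernel, start=1):
        out += k * (np.roll(u, d) + np.roll(u, -d) - 2 * u) * h
    return out

# Synthetic "high-fidelity" training pairs (u, f = L_K u) from random fields
U = rng.standard_normal((8, n))
F = np.array([apply_nonlocal(u, true_kernel) for u in U])

# Assemble the regression system: one column per stencil entry of the kernel
A = np.column_stack([
    np.concatenate([(np.roll(u, d) + np.roll(u, -d) - 2 * u) * h for u in U])
    for d in range(1, radius + 1)
])
b = F.ravel()

learned_kernel, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(learned_kernel, true_kernel))  # noise-free data -> exact recovery
```

In the paper's setting, the least-squares solve would be replaced by a constrained optimization whose inequality constraints on the kernel's non-positive part keep the learned operator provably invertible, even with few training pairs.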
Related papers
- Marginal Causal Flows for Validation and Inference [3.547529079746247]
Investigating the marginal causal effect of an intervention on an outcome from complex data remains challenging.
We introduce Frugal Flows, a novel likelihood-based machine learning model that uses normalising flows to flexibly learn the data-generating process.
We demonstrate the above with experiments on both simulated and real-world datasets.
arXiv Detail & Related papers (2024-11-02T16:04:57Z)
- Hybrid Top-Down Global Causal Discovery with Local Search for Linear and Nonlinear Additive Noise Models [2.0738462952016232]
Methods based on functional causal models can identify a unique graph, but suffer from the curse of dimensionality or impose strong parametric assumptions.
We propose a novel hybrid approach for global causal discovery in observational data that leverages local causal substructures.
We provide theoretical guarantees for correctness and worst-case time complexities, with empirical validation on synthetic data.
arXiv Detail & Related papers (2024-05-23T12:28:16Z)
- Physics-Informed Diffusion Models [0.0]
We present a framework to inform denoising diffusion models of underlying constraints on generated samples during model training.
Our approach improves the alignment of the generated samples with the imposed constraints and significantly outperforms existing methods.
arXiv Detail & Related papers (2024-03-21T13:52:55Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- ManiFlow: Implicitly Representing Manifolds with Normalizing Flows [145.9820993054072]
Normalizing Flows (NFs) are flexible explicit generative models that have been shown to accurately model complex real-world data distributions.
We propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution.
Finally, we focus on 3D point clouds for which we utilize the explicit nature of NFs, i.e. surface normals extracted from the gradient of the log-likelihood and the log-likelihood itself.
arXiv Detail & Related papers (2022-08-18T16:07:59Z)
- Toward Certified Robustness Against Real-World Distribution Shifts [65.66374339500025]
We train a generative model to learn perturbations from data and define specifications with respect to the output of the learned model.
A unique challenge arising from this setting is that existing verifiers cannot tightly approximate sigmoid activations.
We propose a general meta-algorithm for handling sigmoid activations which leverages classical notions of counter-example-guided abstraction refinement.
arXiv Detail & Related papers (2022-06-08T04:09:13Z)
- Nonparametric learning of kernels in nonlocal operators [6.314604944530131]
We provide a rigorous identifiability analysis and convergence study for the learning of kernels in nonlocal operators.
We propose a nonparametric regression algorithm with a novel data adaptive RKHS Tikhonov regularization method based on the function space of identifiability.
arXiv Detail & Related papers (2022-05-23T02:47:55Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.