Data-driven discovery of Green's functions
- URL: http://arxiv.org/abs/2210.16016v1
- Date: Fri, 28 Oct 2022 09:41:50 GMT
- Title: Data-driven discovery of Green's functions
- Authors: Nicolas Boullé
- Abstract summary: This thesis introduces theoretical results and deep learning algorithms to learn Green's functions associated with linear partial differential equations.
The construction connects the fields of PDE learning and numerical linear algebra.
Rational neural networks (NNs), neural networks with trainable rational activation functions, are introduced.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Discovering hidden partial differential equations (PDEs) and operators from
data is an important topic at the frontier between machine learning and
numerical analysis. This doctoral thesis introduces theoretical results and
deep learning algorithms to learn Green's functions associated with linear
partial differential equations and rigorously justify PDE learning techniques.
A theoretically rigorous algorithm is derived to obtain a learning rate, which
characterizes the amount of training data needed to approximately learn Green's
functions associated with elliptic PDEs. The construction connects the fields
of PDE learning and numerical linear algebra by extending the randomized
singular value decomposition to non-standard Gaussian vectors and
Hilbert--Schmidt operators, and exploiting the low-rank hierarchical structure
of Green's functions using hierarchical matrices. Rational neural networks
(NNs) are introduced and consist of neural networks with trainable rational
activation functions. The highly compositional structure of these networks,
combined with rational approximation theory, implies that rational functions
have higher approximation power than standard activation functions. In
addition, rational NNs may have poles and take arbitrarily large values, which
is ideal for approximating functions with singularities such as Green's
functions. Finally, theoretical results on Green's functions and rational NNs
are combined to design a human-understandable deep learning method for
discovering Green's functions from data. This approach complements
state-of-the-art PDE learning techniques, as a wide range of physics can be
captured from the learned Green's functions such as dominant modes, symmetries,
and singularity locations.
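The low-rank structure of Green's functions and the randomized SVD are central to the learning-rate result above. As a minimal sketch (illustrative NumPy code, not taken from the thesis, which works with non-standard Gaussian vectors and Hilbert--Schmidt operators rather than finite matrices), one can probe a discretized smooth kernel with Gaussian random vectors and recover a near-optimal low-rank factorization:

```python
import numpy as np

def randomized_svd(A, rank, n_oversamples=10, rng=None):
    """Basic randomized SVD (Halko-Martinsson-Tropp style sketch).

    Probes the range of A with random Gaussian test vectors, the
    finite-dimensional analogue of sampling an operator with random
    forcing terms.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # Each column of Omega plays the role of one random "training sample".
    Omega = rng.standard_normal((n, rank + n_oversamples))
    Y = A @ Omega                        # sample the range of A
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the range
    B = Q.T @ A                          # small projected matrix
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Illustrative example: a numerically low-rank matrix resembling a
# discretized Green's function (smooth away from the diagonal).
x = np.linspace(0.0, 1.0, 200)
G = np.exp(-np.abs(x[:, None] - x[None, :]))
U, s, Vt = randomized_svd(G, rank=20, rng=0)
err = np.linalg.norm(G - U @ np.diag(s) @ Vt) / np.linalg.norm(G)
```

The rapid singular-value decay of such kernels is what makes a modest number of random probes sufficient, which is the finite-dimensional intuition behind the learning rate for elliptic PDEs.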
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Learning Domain-Independent Green's Function For Elliptic Partial Differential Equations [0.0]
Green's function characterizes a partial differential equation (PDE) and represents its solution over the entire domain as an integral.
We propose a novel boundary integral network to learn the domain-independent Green's function, referred to as BIN-G.
We demonstrate that our numerical scheme enables fast training and accurate evaluation of the Green's function for PDEs with variable coefficients.
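As a concrete illustration of this integral representation (a minimal sketch with an illustrative example, not code from the paper): for the 1D Poisson problem -u'' = f on [0, 1] with u(0) = u(1) = 0, the Green's function is known in closed form, and the solution is a quadrature of G against the forcing term f:

```python
import numpy as np

# Green's function of -u'' = f on [0, 1] with u(0) = u(1) = 0.
def greens_function(x, s):
    return np.where(x <= s, x * (1.0 - s), s * (1.0 - x))

def solve_poisson(f, n=2001):
    """Evaluate u(x) = integral of G(x, s) f(s) ds by the trapezoidal rule."""
    s = np.linspace(0.0, 1.0, n)
    h = s[1] - s[0]
    w = np.full(n, h)
    w[0] = w[-1] = h / 2.0               # trapezoidal quadrature weights
    G = greens_function(s[:, None], s[None, :])
    return s, G @ (w * f(s))             # one matrix-vector product per f

# Check against the exact solution for f = 1: u(x) = x(1 - x)/2.
x, u = solve_poisson(lambda s: np.ones_like(s))
exact = x * (1.0 - x) / 2.0
```

Once G is known, solving for any new forcing term is a single integration, which is why learning the Green's function itself, rather than one particular solution, captures the PDE.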
arXiv Detail & Related papers (2024-01-30T17:00:22Z)
- Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep FS method that simultaneously conducts feature selection and differentiable $ k $-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability issue of learning $ k $-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-21T08:15:55Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- BI-GreenNet: Learning Green's functions by boundary integral network [14.008606361378149]
Green's function plays a significant role in both theoretical analysis and numerical computing of partial differential equations.
We develop a new method for computing Green's function with high accuracy.
arXiv Detail & Related papers (2022-04-28T01:42:35Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- Machine-learning custom-made basis functions for partial differential equations [0.0]
We present an approach for combining deep neural networks with spectral methods to solve PDEs.
We use a deep learning technique known as the Deep Operator Network (DeepONet) to identify candidate functions on which to expand the solution of PDEs.
We exploit the favorable properties of our custom-made basis functions to both study their capability and use them to expand the solution of linear and nonlinear time-dependent PDEs.
arXiv Detail & Related papers (2021-11-09T18:24:23Z)
- UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis function upon each event.
arXiv Detail & Related papers (2020-07-28T09:31:56Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- PDE constraints on smooth hierarchical functions computed by neural networks [0.0]
An important problem in the theory of deep neural networks is expressivity.
We study real infinitely differentiable (smooth) hierarchical functions implemented by feedforward neural networks.
We conjecture that such PDE constraints, once accompanied by appropriate non-singularity conditions, guarantee that the smooth function under consideration can be represented by the network.
arXiv Detail & Related papers (2020-05-18T16:34:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.