Neural Network Approximations of Compositional Functions With
Applications to Dynamical Systems
- URL: http://arxiv.org/abs/2012.01698v1
- Date: Thu, 3 Dec 2020 04:40:25 GMT
- Title: Neural Network Approximations of Compositional Functions With
Applications to Dynamical Systems
- Authors: Wei Kang and Qi Gong
- Abstract summary: We develop an approximation theory for compositional functions and their neural network approximations.
We identify a set of key features of compositional functions and the relationship between the features and the complexity of neural networks.
In addition to function approximation, we prove several error upper-bound formulae for neural networks.
- Score: 3.660098145214465
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As demonstrated in many areas of real-life applications, neural networks have
the capability of dealing with high dimensional data. In the fields of optimal
control and dynamical systems, the same capability was studied and verified in
many published results in recent years. Towards the goal of revealing the
underlying reason why neural networks are capable of solving some high
dimensional problems, we develop an algebraic framework and an approximation
theory for compositional functions and their neural network approximations. The
theoretical foundation is developed so that it supports the error
analysis for not only functions as input-output relations, but also numerical
algorithms. This capability is critical because it enables the analysis of
approximation errors for problems for which analytic solutions are not
available, such as differential equations and optimal control. We identify a
set of key features of compositional functions and the relationship between the
features and the complexity of neural networks. In addition to function
approximation, we prove several error upper-bound formulae for neural networks
that approximate the solutions to differential equations, optimization
problems, and optimal control problems.
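The layered error analysis for compositional functions can be illustrated with a toy sketch. This is illustrative only: the Lipschitz constant and per-node errors below are assumed, not taken from the paper. If each node of a composition f = g ∘ h is approximated to within a pointwise error, and the outer node g is Lipschitz, the composed error admits the kind of layered bound the paper formalizes.

```python
import numpy as np

# Toy illustration (not the paper's construction): approximate each node of
# the composition f = g ∘ h within a pointwise error, then bound the error
# of the composed approximation via the Lipschitz constant of the outer node.
L_g = 2.0                     # Lipschitz constant of g (assumed)
eps_h, eps_g = 1e-3, 1e-3     # per-node approximation errors (assumed)

def h(x):                     # inner node
    return np.sin(x)

def g(u):                     # outer node, Lipschitz with constant 2
    return 2.0 * u

def h_approx(x):              # stand-in for a trained sub-network: h + error
    return np.sin(x) + eps_h

def g_approx(u):              # stand-in for a trained sub-network: g + error
    return 2.0 * u + eps_g

x = np.linspace(-3.0, 3.0, 1001)
err = np.abs(g(h(x)) - g_approx(h_approx(x))).max()
bound = L_g * eps_h + eps_g   # layered bound: L_g * eps_h + eps_g
```

The point of the sketch is that the complexity and error of the composed approximation are governed by per-node quantities, which is the structure the paper's upper bounds exploit.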
Related papers
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - Differentiable Visual Computing for Inverse Problems and Machine
Learning [27.45555082573493]
Visual computing methods are used to analyze geometry, physically simulate solids, fluids, and other media, and render the world via optical techniques.
Deep learning (DL) allows for the construction of general algorithmic models, sidestepping the need for a purely first-principles-based approach to problem solving.
DL is powered by highly parameterized neural network architectures -- universal function approximators -- and gradient-based search algorithms.
arXiv Detail & Related papers (2023-11-21T23:02:58Z) - An Analysis of Physics-Informed Neural Networks [0.0]
We present a new approach to approximating the solution to physical systems - physics-informed neural networks.
The concept of artificial neural networks is introduced, the objective function is defined, and optimisation strategies are discussed.
The partial differential equation is then included as a constraint in the loss function for the problem, giving the network access to knowledge of the dynamics of the physical system it is modelling.
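The loss construction described above can be written out for a one-dimensional toy problem. Here a tiny hand-rolled network and finite differences stand in for a trained PINN; all names and sizes are illustrative assumptions, not the cited paper's setup. For the ODE u'(t) = -u(t) with u(0) = 1, the differential equation enters the loss as a residual term alongside the initial condition.

```python
import numpy as np

# Minimal physics-informed loss sketch (illustrative; a tiny hand-rolled
# tanh network replaces a trained PINN). The PDE residual term gives the
# network access to the dynamics without labelled solution data.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 1)), np.zeros((16, 1))
W2, b2 = rng.normal(size=(1, 16)), np.zeros((1, 1))

def u(t):                      # candidate solution u(t) from the network
    return W2 @ np.tanh(W1 @ t + b1) + b2

def du(t, h=1e-5):             # central finite difference for u'(t)
    return (u(t + h) - u(t - h)) / (2.0 * h)

def pinn_loss(t):
    residual = du(t) + u(t)                # enforce u' = -u in the interior
    ic = u(np.zeros((1, 1))) - 1.0         # enforce u(0) = 1
    return float((residual ** 2).mean() + (ic ** 2).mean())

t = rng.uniform(0.0, 1.0, size=(1, 64))    # collocation points
loss = pinn_loss(t)
```

In an actual PINN this loss would be minimized over the network weights by a gradient-based optimizer, driving both the residual and the initial-condition mismatch toward zero.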
arXiv Detail & Related papers (2023-03-06T04:45:53Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning tasks into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
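The coarse-resolution decomposition can be sketched as a staggered interleaving of a fine grid. This is a guess at the general idea for illustration, not the paper's exact scheme: each staggered sub-field covers the domain at reduced resolution and could be handled by a cheaper solver, and the interleaving is lossless.

```python
import numpy as np

# Illustrative spatial decomposition: split a fine 2D field into s*s
# staggered coarse sub-fields, then interleave them back (exactly).
def decompose(field, s=2):
    # sub-field (i, j) takes every s-th sample starting at offset (i, j)
    return [field[i::s, j::s] for i in range(s) for j in range(s)]

def recompose(subs, s=2):
    n = subs[0].shape[0] * s
    out = np.empty((n, n))
    for k, sub in enumerate(subs):
        i, j = divmod(k, s)
        out[i::s, j::s] = sub
    return out

field = np.arange(16.0).reshape(4, 4)
subs = decompose(field)          # four 2x2 coarse sub-fields
restored = recompose(subs)       # exact round-trip back to the fine grid
```

Because the decomposition is exact, any accuracy loss in such a scheme comes from solving the subtasks at coarse resolution, not from the splitting itself.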
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Physics-aware deep learning framework for linear elasticity [0.0]
The paper presents an efficient and robust data-driven deep learning (DL) computational framework for linear continuum elasticity problems.
For an accurate representation of the field variables, a multi-objective loss function is proposed.
Several benchmark problems, including the Airy solution to elasticity and the Kirchhoff-Love plate problem, are solved.
arXiv Detail & Related papers (2023-02-19T20:33:32Z) - On the Approximation and Complexity of Deep Neural Networks to Invariant
Functions [0.0]
We study the approximation and complexity of deep neural networks to invariant functions.
We show that a broad range of invariant functions can be approximated by various types of neural network models.
We provide a feasible application that connects the parameter estimation and forecasting of high-resolution signals with our theoretical conclusions.
arXiv Detail & Related papers (2022-10-27T09:19:19Z) - Multigoal-oriented dual-weighted-residual error estimation using deep
neural networks [0.0]
Deep learning is considered a powerful tool with high flexibility to approximate functions.
Our approach is based on a posteriori error estimation in which the adjoint problem is solved for the error localization.
An efficient and easy to implement algorithm is developed to obtain a posteriori error estimate for multiple goal functionals.
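For a linear model problem the dual-weighted-residual idea reduces to a few lines; this is a textbook-style sketch under assumed data, not the paper's algorithm. The adjoint solution weights the residual of an approximate solution so that it reproduces the error in the goal functional exactly.

```python
import numpy as np

# Dual-weighted-residual sketch for A u = b with goal J(u) = g·u:
# solve the adjoint A^T z = g; then J(u) - J(u_h) = z·(b - A u_h)
# holds exactly in the linear case, localizing the goal error.
rng = np.random.default_rng(2)
A = rng.normal(size=(5, 5)) + 5.0 * np.eye(5)   # well-conditioned system
b = rng.normal(size=5)
g = rng.normal(size=5)                           # goal functional J(u) = g·u

u = np.linalg.solve(A, b)             # exact solution
u_h = u + 1e-3 * rng.normal(size=5)   # perturbed "numerical" solution
z = np.linalg.solve(A.T, g)           # adjoint solution

estimate = z @ (b - A @ u_h)          # adjoint-weighted residual
true_err = g @ u - g @ u_h            # actual goal-functional error
```

For nonlinear problems or multiple goal functionals, as in the paper above, the identity becomes an estimate and the adjoint problem must itself be approximated.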
arXiv Detail & Related papers (2021-12-21T16:59:44Z) - Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulties in displaying localized effects and strong non-linear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
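A minimal version of the learnable-connectivity idea can be sketched as follows; the node transform and shapes are assumptions for illustration. Each candidate connection in a complete DAG carries a scalar edge weight, and every node aggregates the weighted outputs of all earlier nodes, so the connectivity pattern itself is exposed to gradient-based learning.

```python
import numpy as np

# Illustrative complete-DAG connectivity: edge_w[i, j] is a learnable
# magnitude for the connection from node i to node j; node j aggregates
# all predecessors, so pruning or strengthening edges is differentiable.
rng = np.random.default_rng(1)
n_nodes = 4
edge_w = rng.uniform(size=(n_nodes, n_nodes))   # learnable edge magnitudes

def forward(x):
    outputs = [x]
    for j in range(1, n_nodes):
        # weighted sum over all earlier nodes in the complete graph
        agg = sum(edge_w[i, j] * outputs[i] for i in range(j))
        outputs.append(np.tanh(agg))             # placeholder node transform
    return outputs[-1]

y = forward(np.ones(8))
```

In a real setting the edge weights would be trained jointly with the node transforms, and near-zero edges could be pruned to recover a sparse architecture.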
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.