Deep Neural Network Approximation of Invariant Functions through
Dynamical Systems
- URL: http://arxiv.org/abs/2208.08707v1
- Date: Thu, 18 Aug 2022 08:36:16 GMT
- Title: Deep Neural Network Approximation of Invariant Functions through
Dynamical Systems
- Authors: Qianxiao Li, Ting Lin, Zuowei Shen
- Abstract summary: We study the approximation of functions which are invariant with respect to certain permutations of the input indices using flow maps of dynamical systems.
Such invariant functions include the much-studied translation-invariant ones arising in image tasks, but also encompass many permutation-invariant functions that find emerging applications in science and engineering.
- Score: 15.716533830931766
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the approximation of functions which are invariant with respect to
certain permutations of the input indices using flow maps of dynamical systems.
Such invariant functions include the much-studied translation-invariant ones
arising in image tasks, but also encompass many permutation-invariant
functions that find emerging applications in science and engineering. We prove
sufficient conditions for universal approximation of these functions by a
controlled equivariant dynamical system, which can be viewed as a general
abstraction of deep residual networks with symmetry constraints. These results
not only imply the universal approximation for a variety of commonly employed
neural network architectures for symmetric function approximation, but also
guide the design of architectures with approximation guarantees for
applications involving new symmetry requirements.
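As a purely illustrative reading of this setting, the sketch below is a minimal NumPy example, not the paper's construction: a forward-Euler flow map whose per-token update mixes tokens only through their mean (hence permutation equivariant), composed with a mean-pooling readout, so the overall map is permutation invariant. All names, layer shapes, and the choice of nonlinearity are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def equivariant_residual_flow(X, layers, h=0.1):
    """Forward-Euler flow map z <- z + h * f_theta(z), where each step acts
    identically on every token and mixes tokens only through their mean,
    so the whole flow is permutation equivariant."""
    Z = X.copy()                          # (n_tokens, d)
    for W, U, b in layers:                # time-dependent "controls"
        mixed = Z @ W + Z.mean(axis=0, keepdims=True) @ U + b
        Z = Z + h * np.tanh(mixed)        # residual (Euler) step
    return Z

def invariant_readout(Z, v):
    """Mean-pooling readout: permuting the tokens leaves the output unchanged."""
    return float(np.tanh(Z.mean(axis=0)) @ v)

d = 4
layers = [(rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=d))
          for _ in range(5)]
v = rng.normal(size=d)

X = rng.normal(size=(6, d))               # six interchangeable inputs
perm = rng.permutation(6)
y1 = invariant_readout(equivariant_residual_flow(X, layers), v)
y2 = invariant_readout(equivariant_residual_flow(X[perm], layers), v)
print(np.isclose(y1, y2))                 # True: output is permutation invariant
```

Stacking such residual steps is the discrete analogue of following the controlled equivariant dynamics for a fixed time horizon before applying the invariant readout.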
Related papers
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variate functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
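A minimal sketch of the anti-derivative idea above, under assumed details (a toy one-dimensional integrand and a small analytic surrogate standing in for the trained network; this is not the paper's implementation): if G plays the role of an anti-derivative, its exact derivative g can serve as a control variate because the integral of g over the domain is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate G(x) = sum_i a_i * tanh(w_i * x + c_i) standing in for the
# learned anti-derivative; g = G' is its exact derivative, used as the
# control variate. In practice G would be trained so that g tracks f,
# which is what actually reduces variance.
a = rng.normal(size=8); w = rng.normal(size=8); c = rng.normal(size=8)
G = lambda x: np.sum(a * np.tanh(w * x + c))
g = lambda x: np.sum(a * w * (1.0 - np.tanh(w * x + c) ** 2))

f = lambda x: np.exp(-x * x)              # integrand on [0, 2]
lo, hi = 0.0, 2.0

xs = rng.uniform(lo, hi, size=2_000)
plain_mc = (hi - lo) * np.mean([f(x) for x in xs])
# Control-variate estimator: integrate g exactly via G, Monte Carlo only on f - g.
cv_mc = (G(hi) - G(lo)) + (hi - lo) * np.mean([f(x) - g(x) for x in xs])
print(plain_mc, cv_mc)   # both are unbiased estimates of the integral (~0.882)
```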
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
- Universal Neural Functionals [67.80283995795985]
A challenging problem in many modern machine learning tasks is to process weight-space features.
Recent works have developed promising weight-space models that are equivariant to the permutation symmetries of simple feedforward networks.
This work proposes an algorithm that automatically constructs permutation equivariant models for any weight space.
arXiv Detail & Related papers (2024-02-07T20:12:27Z)
- A Constructive Approach to Function Realization by Neural Stochastic Differential Equations [8.04975023021212]
We introduce structural restrictions on system dynamics and characterize the class of functions that can be realized by such a system.
The systems are implemented as a cascade interconnection of a neural stochastic differential equation (Neural SDE), a deterministic dynamical system, and a readout map.
arXiv Detail & Related papers (2023-07-01T03:44:46Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Unifying O(3) Equivariant Neural Networks Design with Tensor-Network Formalism [12.008737454250463]
We propose using fusion diagrams, a technique widely employed in simulating SU(2)-symmetric quantum many-body problems, to design new equivariant components for equivariant neural networks.
When applied to particles within a given local neighborhood, the resulting components, which we term "fusion blocks," serve as universal approximators of any continuous equivariant function.
Our approach, which combines tensor networks with equivariant neural networks, suggests a potentially fruitful direction for designing more expressive equivariant neural networks.
arXiv Detail & Related papers (2022-11-14T16:06:59Z)
- Generalization capabilities of neural networks in lattice applications [0.0]
We investigate the advantages of adopting translationally equivariant neural networks over non-equivariant ones.
We show that our best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts.
arXiv Detail & Related papers (2021-12-23T11:48:06Z)
- Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
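As a toy illustration of the averaging idea (using the simplest possible frame, the full permutation group of a small input; the paper's frames are generally small, input-dependent sets), averaging an arbitrary backbone over the group makes it exactly invariant:

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)
W = rng.normal(size=(3 * 2, 5)); v = rng.normal(size=5)

def backbone(X):
    """An arbitrary, non-symmetric network on 3 tokens of dimension 2."""
    return float(np.tanh(X.reshape(-1) @ W) @ v)

def frame_averaged(X):
    """Average the backbone over a frame; here the full permutation group S_3,
    which makes the result permutation invariant by construction."""
    return np.mean([backbone(X[list(p)]) for p in permutations(range(len(X)))])

X = rng.normal(size=(3, 2))
print(np.isclose(frame_averaged(X), frame_averaged(X[[2, 0, 1]])))  # True
```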
arXiv Detail & Related papers (2021-10-07T11:05:23Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded in terms of the complexity of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- A Functional Perspective on Learning Symmetric Functions with Neural Networks [48.80300074254758]
We study the learning and representation of neural networks defined on measures.
We establish approximation and generalization bounds under different choices of regularization.
The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes.
arXiv Detail & Related papers (2020-08-16T16:34:33Z)
- On Representing (Anti)Symmetric Functions [19.973896010415977]
We derive natural approximations in the symmetric case, and approximations based on a single generalized Slater determinant in the anti-symmetric case.
We provide a complete and explicit universality proof for the Equivariant MultiLayer Perceptron, which implies universality of symmetric networks and the FermiNet.
arXiv Detail & Related papers (2020-07-30T08:23:33Z)
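To make the symmetric versus anti-symmetric distinction above concrete, here is a hypothetical NumPy sketch (not the paper's construction): a sum-pooled symmetric model and a generalized Slater determinant, whose value flips sign when two inputs are exchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, h = 4, 3, 6
W_phi = rng.normal(size=(d, h)); W_rho = rng.normal(size=h)
W_orb = rng.normal(size=(d, n))            # n "orbitals", one column per slot

def symmetric_model(X):
    """Sum-pool per-particle features, then read out: invariant under any
    permutation of the rows of X."""
    return float(np.tanh(np.tanh(X @ W_phi).sum(axis=0)) @ W_rho)

def slater(X):
    """Generalized Slater determinant det[phi_j(x_i)]: swapping two rows of X
    swaps two rows of the matrix, so the value changes sign (anti-symmetry)."""
    return float(np.linalg.det(np.tanh(X @ W_orb)))

X = rng.normal(size=(n, d))
X_swapped = X[[1, 0, 2, 3]]
print(np.isclose(symmetric_model(X), symmetric_model(X_swapped)))   # True
print(np.isclose(slater(X), -slater(X_swapped)))                    # True
```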
This list is automatically generated from the titles and abstracts of the papers in this site.