Homogeneous Artificial Neural Network
- URL: http://arxiv.org/abs/2311.17973v1
- Date: Wed, 29 Nov 2023 16:16:32 GMT
- Title: Homogeneous Artificial Neural Network
- Authors: Andrey Polyakov
- Abstract summary: The paper proposes an artificial neural network (ANN) that is a global approximator for a special class of functions known as generalized homogeneous.
Homogeneity means symmetry of a function with respect to a group of transformations having the topological characterization of a dilation.
- Score: 0.6526824510982799
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The paper proposes an artificial neural network (ANN) that is a global
approximator for a special class of functions known as generalized
homogeneous. Homogeneity means symmetry of a function with respect to a
group of transformations having the topological characterization of a dilation. In
this paper, a class of so-called linear dilations is considered. A
homogeneous universal approximation theorem is proven. Procedures for
upgrading an existing ANN to a homogeneous one are developed. Theoretical
results are supported by examples from various domains (computer science,
systems theory, and automatic control).
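The notion of generalized homogeneity described in the abstract can be illustrated numerically. The sketch below is an assumption based on the standard definition of homogeneity with respect to a linear dilation d(s) = exp(sG), not code from the paper: a function f is homogeneous of degree ν if f(d(s)x) = e^{νs} f(x) for all s and x.

```python
import numpy as np
from scipy.linalg import expm

# Generator of a linear (weighted) dilation d(s) = exp(s * G).
# Diagonal G gives the classical weighted dilation with weights (1, 2).
G = np.diag([1.0, 2.0])
nu = 2.0  # homogeneity degree


def f(x):
    # Example weighted-homogeneous function: x1^2 + x2 has degree 2
    # under the weights (1, 2), since (e^s x1)^2 + e^{2s} x2 = e^{2s} f(x).
    return x[0] ** 2 + x[1]


def dilation(s, x):
    # Apply the linear dilation d(s) = exp(s * G) to the point x.
    return expm(s * G) @ x


# Numerically check f(d(s) x) == exp(nu * s) * f(x) for a few s values.
x = np.array([0.7, -1.3])
for s in (-1.0, 0.5, 2.0):
    lhs = f(dilation(s, x))
    rhs = np.exp(nu * s) * f(x)
    assert np.isclose(lhs, rhs)
```

The example uses a diagonal generator for readability; the same check applies to any generator matrix G defining a linear dilation.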
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
We introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations, a topological regularization loss encouraging clustering within classes.
arXiv Detail & Related papers (2024-09-17T08:09:22Z)
- A tradeoff between universality of equivariant models and learnability of symmetries [0.0]
We prove that, under certain conditions, it is impossible to simultaneously learn symmetries and equivariant functions.
We analyze certain families of neural networks for whether they satisfy the conditions of the impossibility result.
On the practical side, our analysis of group-convolutional neural networks allows us to generalize the well-known "convolution is all you need" result to non-homogeneous spaces.
arXiv Detail & Related papers (2022-10-17T21:23:22Z)
- Extending the Universal Approximation Theorem for a Broad Class of Hypercomplex-Valued Neural Networks [1.0323063834827413]
The universal approximation theorem asserts that a single hidden layer neural network approximates continuous functions with any desired precision on compact sets.
This paper extends the universal approximation theorem for a broad class of hypercomplex-valued neural networks.
arXiv Detail & Related papers (2022-09-06T12:45:15Z)
- Powerful Graph Convolutional Networks with Adaptive Propagation Mechanism for Homophily and Heterophily [38.50800951799888]
Graph Convolutional Networks (GCNs) have been widely applied in various fields due to their significant power on processing graph-structured data.
Existing methods deal with heterophily mainly by aggregating higher-order neighborhoods or combining immediate representations.
This paper proposes a novel propagation mechanism, which can automatically change the propagation and aggregation process according to homophily or heterophily.
arXiv Detail & Related papers (2021-12-27T08:19:23Z)
- Generalization capabilities of neural networks in lattice applications [0.0]
We investigate the advantages of adopting translationally equivariant neural networks in favor of non-equivariant ones.
We show that our best equivariant architectures can perform and generalize significantly better than their non-equivariant counterparts.
arXiv Detail & Related papers (2021-12-23T11:48:06Z)
- Stability of Algebraic Neural Networks to Small Perturbations [179.55535781816343]
Algebraic neural networks (AlgNNs) are composed of a cascade of layers, each associated with an algebraic signal model.
We show how any architecture that uses a formal notion of convolution can be stable beyond particular choices of the shift operator.
arXiv Detail & Related papers (2020-10-22T09:10:16Z)
- Algebraic Neural Networks: Stability to Deformations [179.55535781816343]
We study algebraic neural networks (AlgNNs) with commutative algebras.
AlgNNs unify diverse architectures such as Euclidean convolutional neural networks, graph neural networks, and group neural networks.
arXiv Detail & Related papers (2020-09-03T03:41:38Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators [72.62940905965267]
Invertible neural networks based on coupling flows (CF-INNs) have various machine learning applications such as image synthesis and representation learning.
Are CF-INNs universal approximators for invertible functions?
We prove a general theorem to show the equivalence of the universality for certain diffeomorphism classes.
arXiv Detail & Related papers (2020-06-20T02:07:37Z)
- Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.