Universal approximation property of ODENet and ResNet with a single activation function
- URL: http://arxiv.org/abs/2410.16709v1
- Date: Tue, 22 Oct 2024 05:27:01 GMT
- Title: Universal approximation property of ODENet and ResNet with a single activation function
- Authors: Masato Kimura, Kazunori Matsui, Yosuke Mizuno
- Abstract summary: We study a universal approximation property of ODENet and ResNet.
We show that such an ODENet and ResNet with a restricted vector field can uniformly approximate ODENet with a general vector field.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study a universal approximation property of ODENet and ResNet. The ODENet is a map from an initial value to the final value of an ODE system in a finite interval. It is considered a mathematical model of a ResNet-type deep learning system. We consider dynamical systems with vector fields given by a single composition of the activation function and an affine mapping, which is the most common choice of the ODENet or ResNet vector field in actual machine learning systems. We show that such an ODENet and ResNet with a restricted vector field can uniformly approximate ODENet with a general vector field.
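The abstract's correspondence between ODENet and ResNet can be illustrated by a minimal sketch (not the paper's construction): discretizing the ODE dx/dt = σ(Wx + b), whose vector field is a single composition of an activation and an affine map, with the forward Euler method yields ResNet-style residual updates. The parameters `W`, `b`, and the tanh activation below are illustrative choices, not taken from the paper.

```python
import numpy as np

def odenet_forward(x0, W, b, T=1.0, n_steps=10):
    """Forward-Euler discretization of dx/dt = tanh(W x + b) on [0, T].

    Each Euler step x <- x + h * tanh(W x + b) has the form of a
    ResNet residual block, illustrating the ODENet/ResNet analogy.
    """
    h = T / n_steps
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x + h * np.tanh(W @ x + b)  # residual update with shared weights
    return x

# Hypothetical toy parameters: a 2-dimensional state and a rotation-like affine map.
W = np.array([[0.0, -1.0], [1.0, 0.0]])
b = np.zeros(2)
x_final = odenet_forward(np.array([1.0, 0.0]), W, b)
```

Refining `n_steps` recovers the continuous-time ODENet flow in the limit; with `n_steps` fixed, the loop is an n-block ResNet with tied weights.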
Related papers
- An Intrinsic Vector Heat Network [64.55434397799728]
This paper introduces a novel neural network architecture for learning tangent vector fields on surfaces embedded in 3D.
We introduce a trainable vector heat diffusion module to spatially propagate vector-valued feature data across the surface.
We also demonstrate the effectiveness of our method on the useful industrial application of quadrilateral mesh generation.
arXiv Detail & Related papers (2024-06-14T00:40:31Z) - GIT-Net: Generalized Integral Transform for Operator Learning [58.13313857603536]
This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators.
GIT-Net harnesses the fact that differential operators commonly used for defining PDEs can often be represented parsimoniously when expressed in specialized functional bases.
Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting small test errors and low evaluation costs across a range of PDE problems.
arXiv Detail & Related papers (2023-12-05T03:03:54Z) - Faster Training of Neural ODEs Using Gau{\ss}-Legendre Quadrature [68.9206193762751]
We propose an alternative way to speed up the training of neural ODEs.
We use Gauss-Legendre quadrature to solve integrals faster than ODE-based methods.
We also extend the idea to training SDEs using the Wong-Zakai theorem, by training a corresponding ODE and transferring the parameters.
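The quadrature idea in this blurb can be sketched with a generic n-point Gauss-Legendre rule (this is standard quadrature, not the paper's training pipeline): nodes and weights on [-1, 1] are mapped to the integration interval, and the rule is exact for polynomials of degree up to 2n - 1.

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n=5):
    """Approximate the integral of f over [a, b] with n-point Gauss-Legendre quadrature."""
    nodes, weights = np.polynomial.legendre.leggauss(n)  # nodes/weights on [-1, 1]
    t = 0.5 * (b - a) * nodes + 0.5 * (b + a)            # affine map to [a, b]
    return 0.5 * (b - a) * np.sum(weights * f(t))

# A 3-point rule is exact for polynomials of degree <= 5:
approx = gauss_legendre_integral(lambda t: t**4, 0.0, 1.0, n=3)
# approx ≈ 0.2, the exact value of the integral of t^4 over [0, 1]
```

In the neural-ODE setting, replacing an adaptive ODE solve of an integral loss term with a fixed, small quadrature rule of this form is what makes the evaluation cheap.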
arXiv Detail & Related papers (2023-08-21T11:31:15Z) - From NeurODEs to AutoencODEs: a mean-field control framework for width-varying Neural Networks [68.8204255655161]
We propose a new type of continuous-time control system, called AutoencODE, based on a controlled vector field that drives the dynamics.
We show that many architectures can be recovered in regions where the loss function is locally convex.
arXiv Detail & Related papers (2023-07-05T13:26:17Z) - Neural Generalized Ordinary Differential Equations with Layer-varying Parameters [1.3691539554014036]
We show that the layer-varying Neural-GODE is more flexible and general than the standard Neural-ODE.
The Neural-GODE enjoys the computational and memory benefits while performing comparably to ResNets in prediction accuracy.
arXiv Detail & Related papers (2022-09-21T20:02:28Z) - Learning Multi-Object Dynamics with Compositional Neural Radiance Fields [63.424469458529906]
We present a method to learn compositional predictive models from image observations based on implicit object encoders, Neural Radiance Fields (NeRFs), and graph neural networks.
NeRFs have become a popular choice for representing scenes due to their strong 3D prior.
For planning, we utilize RRTs in the learned latent space, where we can exploit our model and the implicit object encoder to make sampling the latent space informative and more efficient.
arXiv Detail & Related papers (2022-02-24T01:31:29Z) - HeunNet: Extending ResNet using Heun's Methods [1.0071258008543083]
HeunNet is a predictor-corrector variant of ResNet.
Heun's method is more accurate than Euler's.
HeunNet achieves high accuracy with low computational time.
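The predictor-corrector structure behind this blurb can be sketched as a single Heun step (standard numerical analysis, not HeunNet's learned layers): an Euler predictor followed by a trapezoidal corrector that averages the two slopes, giving second-order accuracy versus Euler's first order.

```python
import numpy as np

def heun_step(f, x, t, h):
    """One Heun (explicit trapezoidal) step for dx/dt = f(t, x)."""
    k1 = f(t, x)                     # predictor slope (plain Euler)
    x_pred = x + h * k1              # Euler prediction
    k2 = f(t + h, x_pred)            # slope at the predicted point
    return x + 0.5 * h * (k1 + k2)   # corrector: average the two slopes

# Toy check on dx/dt = x with x(0) = 1, whose exact solution is e^t.
x = 1.0
h = 0.1
for i in range(10):
    x = heun_step(lambda t, y: y, x, i * h, h)
# x ≈ 2.714, close to the exact value e ≈ 2.71828
```

In HeunNet's terms, the predictor and corrector evaluations play the role of two residual-block applications per step, which is where the accuracy gain over a plain ResNet/Euler update comes from.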
arXiv Detail & Related papers (2021-05-13T09:55:26Z) - Artificial neural network as a universal model of nonlinear dynamical systems [0.0]
The map is built as an artificial neural network whose weights encode a modeled system.
We consider the Lorenz system, the Roessler system and also the Hindmarsh-Rose neuron.
High similarity is observed for visual images of attractors, power spectra, bifurcation diagrams and Lyapunov exponents.
arXiv Detail & Related papers (2021-03-06T16:02:41Z) - Accelerating ODE-Based Neural Networks on Low-Cost FPGAs [3.4795226670772745]
ODENet is a deep neural network architecture in which a stacking structure of ResNet is implemented with an ordinary differential equation solver.
It can reduce the number of parameters and strike a balance between accuracy and performance by selecting a proper solver.
It is also possible to improve the accuracy while keeping the same number of parameters on resource-limited edge devices.
arXiv Detail & Related papers (2020-12-31T06:39:22Z) - Universal Approximation Properties for an ODENet and a ResNet: Mathematical Analysis and Numerical Experiments [0.0]
We prove a universal approximation property (UAP) for a class of ODENet and a class of ResNet.
We use this to construct a learning algorithm for ODENet.
arXiv Detail & Related papers (2020-12-22T06:04:09Z) - Deep Polynomial Neural Networks [77.70761658507507]
$\Pi$Nets are a new class of function approximators based on polynomial expansions.
$\Pi$Nets produce state-of-the-art results in three challenging tasks, i.e. image generation, face verification and 3D mesh representation learning.
arXiv Detail & Related papers (2020-06-20T16:23:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.