Robust normalizing flows using Bernstein-type polynomials
- URL: http://arxiv.org/abs/2102.03509v1
- Date: Sat, 6 Feb 2021 04:32:05 GMT
- Title: Robust normalizing flows using Bernstein-type polynomials
- Authors: Sameera Ramasinghe, Kasun Fernando, Salman Khan, Nick Barnes
- Abstract summary: Normalizing flows (NFs) are a class of generative models that allow exact density evaluation and sampling.
We propose a framework to construct NFs based on increasing triangular maps and Bernstein-type polynomials.
We empirically demonstrate the efficacy of the proposed technique using experiments on both real-world and synthetic datasets.
- Score: 31.533158456141305
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows (NFs) are a class of generative models that allow exact
density evaluation and sampling. We propose a framework to construct NFs based
on increasing triangular maps and Bernstein-type polynomials. Compared to the
existing (universal) NF frameworks, our method provides compelling advantages
like theoretical upper bounds for the approximation error, robustness, higher
interpretability, suitability for compactly supported densities, and the
ability to employ higher degree polynomials without training instability.
Moreover, we provide a constructive universality proof, which gives analytic
expressions of the approximations for known transformations. We conduct a
thorough theoretical analysis and empirically demonstrate the efficacy of the
proposed technique using experiments on both real-world and synthetic datasets.
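The abstract's core construction is a strictly increasing map built from a Bernstein polynomial, whose monotonicity (and hence invertibility) follows from increasing coefficients. The sketch below illustrates this idea in one dimension with a hypothetical softplus-cumsum parameterisation; it is not the paper's exact construction, only a minimal example of a Bernstein-polynomial flow layer and its log-Jacobian on [0, 1].

```python
import math
import numpy as np

def bernstein_flow(x, theta):
    """Monotone map on [0, 1] from a degree-n Bernstein polynomial.

    theta: unconstrained parameters. A softplus followed by a cumulative
    sum makes the Bernstein coefficients strictly increasing, which
    guarantees a strictly increasing (invertible) map. Hypothetical
    parameterisation for illustration.
    Returns the transformed points and the log-derivative (the 1-D
    log |det Jacobian| used in the change-of-variables formula).
    """
    n = len(theta) - 1
    steps = np.log1p(np.exp(theta))                 # softplus > 0
    beta = np.cumsum(steps)
    beta = (beta - beta[0]) / (beta[-1] - beta[0])  # pin endpoints to 0 and 1

    k = np.arange(n + 1)
    binom = np.array([math.comb(n, i) for i in k])
    basis = binom * x[:, None] ** k * (1 - x[:, None]) ** (n - k)
    y = basis @ beta

    # d/dx of a Bernstein polynomial:
    # n * sum_k (beta_{k+1} - beta_k) * B_{k, n-1}(x)
    kb = np.arange(n)
    binom1 = np.array([math.comb(n - 1, i) for i in kb])
    basis1 = binom1 * x[:, None] ** kb * (1 - x[:, None]) ** (n - 1 - kb)
    dydx = n * (basis1 @ np.diff(beta))
    return y, np.log(dydx)
```

Because the coefficient differences are strictly positive, the derivative never vanishes on (0, 1), which is what makes higher-degree polynomials usable without the training instability the abstract mentions.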
Related papers
- Input convex neural networks: universal approximation theorem and implementation for isotropic polyconvex hyperelastic energies [0.0]
This paper presents a novel framework that enforces necessary physical and mathematical constraints while retaining the universal approximation property.
A universal approximation theorem for the proposed approach is proven.
The proposed network can approximate any frame-indifferent, isotropic polyconvex energy (provided the network is large enough).
Comparisons with existing approaches highlight the advantages of the proposed method.
arXiv Detail & Related papers (2025-02-12T16:15:03Z) - Topological Eigenvalue Theorems for Tensor Analysis in Multi-Modal Data Fusion [0.0]
This paper presents a novel framework for tensor eigenvalue analysis in the context of multi-modal data fusion.
By establishing new theorems that link eigenvalues to topological features, the proposed framework provides deeper insights into the latent structure of data.
arXiv Detail & Related papers (2024-09-14T09:46:15Z) - Assessment of Uncertainty Quantification in Universal Differential Equations [1.374796982212312]
Universal Differential Equations (UDEs) are used to combine prior knowledge in the form of mechanistic formulations with universal function approximators, like neural networks.
We provide a formalisation of uncertainty quantification (UQ) for UDEs and investigate important frequentist and Bayesian methods.
arXiv Detail & Related papers (2024-06-13T06:36:19Z) - A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variations of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z) - Finite-dimensional approximations of push-forwards on locally analytic functionals [5.787117733071417]
Our approach is to consider the push-forward on the space of locally analytic functionals, instead of directly handling the analytic map itself.
We establish a methodology enabling appropriate finite-dimensional approximation of the push-forward from finite discrete data.
arXiv Detail & Related papers (2024-04-16T17:53:59Z) - Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, the theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z) - Gauge-equivariant flow models for sampling in lattice field theories
with pseudofermions [51.52945471576731]
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as estimators for the fermionic determinant.
This is the default approach in state-of-the-art lattice field theory calculations, making this development critical to the practical application of flow models to theories such as QCD.
arXiv Detail & Related papers (2022-07-18T21:13:34Z) - Controlling the Complexity and Lipschitz Constant improves polynomial
nets [55.121200972539114]
We derive new complexity bounds for the set of Coupled CP-Decomposition (CCP) and Nested Coupled CP-decomposition (NCP) models of Polynomial Nets.
We propose a principled regularization scheme that we evaluate experimentally in six datasets and show that it improves the accuracy as well as the robustness of the models to adversarial perturbations.
arXiv Detail & Related papers (2022-02-10T14:54:29Z) - Triangular Flows for Generative Modeling: Statistical Consistency,
Smoothness Classes, and Fast Rates [8.029049649310211]
Triangular flows, also known as Knöthe-Rosenblatt measure couplings, comprise an important building block of normalizing flow models.
We present statistical guarantees and sample complexity bounds for triangular flow statistical models.
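The defining property of a triangular (Knöthe-Rosenblatt) map is that component k depends only on the first k inputs, so the Jacobian is triangular and its log-determinant is just the sum of the diagonal log-derivatives. The toy 2-D example below (with arbitrary illustrative constants a, b, c) makes that structure concrete.

```python
import numpy as np

def triangular_map(x, a=1.5, b=0.7, c=0.4):
    """Toy 2-D increasing triangular (Knothe-Rosenblatt-style) map.

    T1 depends only on x1 and T2 on (x1, x2), so the Jacobian is lower
    triangular and log |det J| reduces to the sum of the diagonal
    log-derivatives. Constants a, b, c are hypothetical, chosen so both
    diagonal derivatives are strictly positive (an increasing map).
    """
    x1, x2 = x[:, 0], x[:, 1]
    y1 = a * x1 + np.tanh(x1)        # dT1/dx1 = a + sech^2(x1) > 0
    y2 = b * x2 + c * x1 ** 2        # dT2/dx2 = b > 0
    log_det = np.log(a + 1.0 / np.cosh(x1) ** 2) + np.log(b)
    return np.stack([y1, y2], axis=1), log_det
```

This diagonal structure is why triangular flows (including the Bernstein-polynomial construction above) admit cheap exact density evaluation: no general determinant is ever computed.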
arXiv Detail & Related papers (2021-12-31T18:57:37Z) - Optimizing Information-theoretical Generalization Bounds via Anisotropic
Noise in SGLD [73.55632827932101]
We optimize the information-theoretical generalization bound by manipulating the noise structure in SGLD.
We prove that with constraint to guarantee low empirical risk, the optimal noise covariance is the square root of the expected gradient covariance.
arXiv Detail & Related papers (2021-10-26T15:02:27Z) - A Unified Framework for Coupled Tensor Completion [42.19293115131073]
Coupled tensor decomposition reveals the joint data structure by incorporating prior knowledge that comes from the latent coupled factors.
The tensor ring (TR) decomposition has powerful expressive ability and has achieved success in some multi-dimensional data processing applications.
The proposed method is validated on numerical experiments on synthetic data, and experimental results on real-world data demonstrate its superiority over the state-of-the-art methods in terms of recovery accuracy.
arXiv Detail & Related papers (2020-01-09T02:15:46Z) - Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
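The conditioning idea can be sketched with a single affine flow layer whose scale and shift are functions of x. A real CNF would produce these from a neural network; here a toy linear conditioner (hypothetical coefficients a, b, c) stands in, and the conditional density follows from the usual change-of-variables formula against a standard normal base.

```python
import numpy as np

def log_prob_conditional(y, x, a=0.5, b=1.0, c=0.3):
    """log p(y|x) under a one-layer conditional affine flow.

    The scale s(x) and shift t(x) depend on the conditioning input x
    (toy linear conditioner; a real CNF replaces it with a network).
    The inverse map sends y to the base space, and the log-density is
    the base log-density minus the log of the scale (1-D log |det J|).
    """
    s = np.exp(a * x)                                 # positive scale
    t = b + c * x                                     # shift
    z = (y - t) / s                                   # inverse map to base space
    log_base = -0.5 * (z ** 2 + np.log(2 * np.pi))    # standard normal base
    return log_base - np.log(s)                       # change of variables
```

For any fixed x this defines a proper (normalized) density over y, which is what makes CNFs usable for exact conditional likelihood training.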
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.