Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents
- URL: http://arxiv.org/abs/2203.10592v1
- Date: Sun, 20 Mar 2022 16:23:17 GMT
- Title: Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents
- Authors: Alessandro Barp, Lancelot Da Costa, Guilherme França, Karl
Friston, Mark Girolami, Michael I. Jordan, and Grigorios A. Pavliotis
- Abstract summary: We identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making.
We derive algorithms that exploit these geometric structures to solve these problems efficiently.
- Score: 102.42623636238399
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this chapter, we identify fundamental geometric structures that underlie
the problems of sampling, optimisation, inference and adaptive decision-making.
Based on this identification, we derive algorithms that exploit these geometric
structures to solve these problems efficiently. We show that a wide range of
geometric theories emerge naturally in these fields, including
measure-preserving processes, information divergences, Poisson geometry, and
geometric integration. Specifically, we explain how \emph{(i)} leveraging the
symplectic geometry of Hamiltonian systems enables us to construct (accelerated)
sampling and optimisation methods, \emph{(ii)} the theory of Hilbertian
subspaces and Stein operators provides a general methodology for obtaining robust
estimators, and \emph{(iii)} preserving the information geometry of decision-making
yields adaptive agents that perform active inference. Throughout, we emphasise
the rich connections between these fields; e.g., inference draws on sampling
and optimisation, and adaptive decision-making assesses decisions by inferring
their counterfactual consequences. Our exposition provides a conceptual
overview of underlying ideas, rather than a technical discussion, which can be
found in the references herein.
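Point (i) of the abstract can be illustrated with a minimal Hamiltonian Monte Carlo sampler: the leapfrog integrator is a symplectic map, so it approximately conserves the Hamiltonian and keeps Metropolis acceptance rates high. The Gaussian target, step size, and trajectory length below are illustrative assumptions, not choices taken from the chapter.

```python
import numpy as np

def grad_U(q):
    # Potential U(q) = q^2 / 2, i.e. a standard Gaussian target exp(-U).
    return q

def leapfrog(q, p, step=0.1, n_steps=20):
    # Symplectic (volume-preserving) integrator for Hamilton's equations.
    p = p - 0.5 * step * grad_U(q)        # initial half step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p                  # full step in position
        p = p - step * grad_U(q)          # full step in momentum
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)        # final half step in momentum
    return q, p

def hmc(n_samples=5000, seed=0):
    rng = np.random.default_rng(seed)
    q, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.standard_normal()                     # resample momentum
        H0 = 0.5 * q**2 + 0.5 * p**2                  # H(q, p) = U(q) + K(p)
        q_new, p_new = leapfrog(q, p)
        H1 = 0.5 * q_new**2 + 0.5 * p_new**2
        if rng.random() < np.exp(H0 - H1):            # Metropolis correction
            q = q_new
        samples.append(q)
    return np.array(samples)

samples = hmc()
print(samples.mean(), samples.std())
```

Because the leapfrog map nearly conserves H, the energy error H1 - H0 stays small and almost every proposal is accepted; the Metropolis step then exactly corrects the residual discretisation bias.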
Related papers
- Geometrically Inspired Kernel Machines for Collaborative Learning Beyond Gradient Descent [36.59087823764832]
This paper develops a novel mathematical framework for collaborative learning by means of geometrically inspired kernel machines.
For classification problems, this approach allows us to learn bounded geometric structures around given data points.
arXiv Detail & Related papers (2024-07-05T08:20:27Z)
- A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variations of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z)
- Adaptive Surface Normal Constraint for Geometric Estimation from Monocular Images [56.86175251327466]
We introduce a novel approach to learn geometries such as depth and surface normal from images while incorporating geometric context.
Our approach extracts geometric context that encodes the geometric variations present in the input image and correlates depth estimation with geometric constraints.
Our method unifies depth and surface normal estimations within a cohesive framework, which enables the generation of high-quality 3D geometry from images.
arXiv Detail & Related papers (2024-02-08T17:57:59Z)
- Optimizing Solution-Samplers for Combinatorial Problems: The Landscape of Policy-Gradient Methods [52.0617030129699]
We introduce a novel theoretical framework for analyzing the effectiveness of DeepMatching Networks and Reinforcement Learning methods.
Our main contribution holds for a broad class of problems, including Max- and Min-Cut, Max-$k$-Bipartite-Bi, Maximum-Weight-Bipartite-Bi, and the Traveling Salesman Problem.
As a byproduct of our analysis we introduce a novel regularization process over vanilla descent and provide theoretical and experimental evidence that it helps address vanishing-gradient issues and escape bad stationary points.
arXiv Detail & Related papers (2023-10-08T23:39:38Z)
- Warped geometric information on the optimisation of Euclidean functions [43.43598316339732]
We consider optimisation of a real-valued function defined in a potentially high-dimensional Euclidean space.
We find the function's optimum along a manifold with a warped metric.
Our proposed algorithm, using 3rd-order approximation of geodesics, tends to outperform standard Euclidean gradient-based counterparts.
arXiv Detail & Related papers (2023-08-16T12:08:50Z)
- A Survey of Geometric Optimization for Deep Learning: From Euclidean Space to Riemannian Manifold [7.737713458418288]
Deep Learning (DL) has achieved success in complex Artificial Intelligence (AI) tasks, but it suffers from various notorious problems.
This article presents a comprehensive survey of applying geometric optimization in DL.
It investigates the application of geometric optimization in different DL networks across various AI tasks, e.g., convolutional neural networks, recurrent neural networks, transfer learning, and optimal transport.
arXiv Detail & Related papers (2023-02-16T10:50:15Z)
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
- Learning Geometric Combinatorial Optimization Problems using Self-attention and Domain Knowledge [0.0]
We propose a novel neural network model that solves COPs involving geometry based on self-attention and a new attention mechanism.
The proposed model is designed such that the model efficiently learns point-to-point relationships in COPs involving geometry using self-attention in the encoder.
In the decoder, a new masking scheme using domain knowledge is proposed to provide a high penalty when the geometric requirement of the problem is not satisfied.
arXiv Detail & Related papers (2021-07-05T01:56:37Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
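The simplest instance of such a measure-preserving diffusion is overdamped Langevin dynamics, dX = -∇U(X) dt + √2 dW, whose stationary measure is exp(-U); discretising it with Euler-Maruyama gives the unadjusted Langevin algorithm (ULA). A minimal sketch, with the quadratic potential and step size as illustrative assumptions:

```python
import numpy as np

def grad_U(x):
    # U(x) = x^2 / 2, so the invariant law of the diffusion is standard Gaussian.
    return x

def ula(n_steps=20000, step=0.05, seed=0):
    # Euler-Maruyama discretisation of dX = -grad_U(X) dt + sqrt(2) dW.
    rng = np.random.default_rng(seed)
    x, xs = 0.0, []
    for _ in range(n_steps):
        x = x - step * grad_U(x) + np.sqrt(2 * step) * rng.standard_normal()
        xs.append(x)
    return np.array(xs)

xs = ula()
print(xs.mean(), xs.var())
```

The discretisation introduces an O(step) bias in the invariant measure; adding a Metropolis accept/reject step (MALA) removes it, which is one way these diffusions recover exact MCMC algorithms.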
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Some Geometrical and Topological Properties of DNNs' Decision Boundaries [4.976129960952446]
We use differential geometry to explore the geometrical and topological properties of decision regions produced by deep neural networks (DNNs).
Based on the Gauss-Bonnet-Chern theorem in differential geometry, we then propose a method to compute the Euler characteristics of compact decision boundaries.
arXiv Detail & Related papers (2020-03-07T23:46:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.