Deep Learning of Multivariate Extremes via a Geometric Representation
- URL: http://arxiv.org/abs/2406.19936v2
- Date: Fri, 13 Sep 2024 11:18:42 GMT
- Title: Deep Learning of Multivariate Extremes via a Geometric Representation
- Authors: Callum J. R. Murphy-Barltrop, Reetam Majumder, Jordan Richards,
- Abstract summary: We study geometric extremes, where extremal dependence properties are inferred from the limiting shapes of scaled sample clouds.
We propose a range of novel theoretical results to aid with the implementation of the geometric extremes framework.
We introduce the first approach to modelling limit sets using deep learning.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The study of geometric extremes, where extremal dependence properties are inferred from the deterministic limiting shapes of scaled sample clouds, provides an exciting approach to modelling the extremes of multivariate data. These shapes, termed limit sets, link together several popular extremal dependence modelling frameworks. Although the geometric approach is becoming an increasingly popular modelling tool, current inference techniques are limited to a low dimensional setting (d < 5), and generally require rigid modelling assumptions. In this work, we propose a range of novel theoretical results to aid with the implementation of the geometric extremes framework and introduce the first approach to modelling limit sets using deep learning. By leveraging neural networks, we construct asymptotically-justified yet flexible semi-parametric models for extremal dependence of high-dimensional data. We showcase the efficacy of our deep approach by modelling the complex extremal dependencies between meteorological and oceanographic variables in the North Sea off the coast of the UK.
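The geometric setup described in the abstract can be illustrated with a minimal sketch. This is not the paper's deep-learning method; it is a generic NumPy illustration, under the standard assumption of exponential margins, of how a sample cloud scaled by log n concentrates onto a deterministic limit set, with a crude nonparametric boundary estimate per angular bin (all variable names and the binning scheme are our own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Simulate i.i.d. bivariate data with standard exponential margins
# (independent components, purely for illustration).
X = rng.exponential(scale=1.0, size=(n, 2))

# Scaled sample cloud: with exponential margins, X_i / log(n)
# concentrates onto a deterministic limit set as n grows.
C = X / np.log(n)

# Radial-angular decomposition with the L1 norm: R = x1 + x2, W = x1 / R.
R = C.sum(axis=1)
W = C[:, 0] / R  # pseudo-angle on the unit simplex, in (0, 1)

# Crude nonparametric boundary estimate: maximum radius per angular bin.
bins = np.linspace(0.0, 1.0, 21)
idx = np.clip(np.digitize(W, bins) - 1, 0, 19)
boundary = np.array([R[idx == k].max() for k in range(20)])

# For independent exponentials the gauge function is g(x) = x1 + x2,
# so the L1 radial function of the limit set is 1 at every angle; the
# empirical boundary should sit near 1 (convergence is slow, so expect
# a modest overshoot).
print(boundary.round(2))
```

The paper's contribution, by contrast, is to replace this kind of rigid or purely empirical boundary estimate with a flexible neural-network model of the limit set that remains feasible beyond d < 5.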
Related papers
- Deep learning joint extremes of metocean variables using the SPAR model [0.0]
This paper presents a novel deep learning framework for estimating multivariate joint extremes of metocean variables.
We show how the method can be applied in higher dimensions, using a case study for five metocean variables.
Our data-driven approach provides great flexibility in the dependence structures that can be represented.
arXiv Detail & Related papers (2024-12-20T11:39:07Z) - Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that, under these geometric priors, the generative functional model admits the same symmetries.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Topological Parallax: A Geometric Specification for Deep Perception
Models [0.778001492222129]
We introduce topological parallax as a theoretical and computational tool that compares a trained model to a reference dataset.
Our examples show that this geometric similarity between dataset and model is essential to trustworthy interpolation and perturbation.
This new concept will add value to the current debate regarding the unclear relationship between overfitting and generalization in applications of deep learning.
arXiv Detail & Related papers (2023-06-20T18:45:24Z) - Model-Based Deep Learning: On the Intersection of Deep Learning and
Optimization [101.32332941117271]
Decision making algorithms are used in a multitude of different applications.
Deep learning approaches that use highly parametric architectures tuned from data without relying on mathematical models are becoming increasingly popular.
Model-based optimization and data-centric deep learning are often considered to be distinct disciplines.
arXiv Detail & Related papers (2022-05-05T13:40:08Z) - A Class of Geometric Structures in Transfer Learning: Minimax Bounds and
Optimality [5.2172436904905535]
We exploit the geometric structure of the source and target domains for transfer learning.
Our proposed estimator outperforms state-of-the-art transfer learning methods in both moderate- and high-dimensional settings.
arXiv Detail & Related papers (2022-02-23T18:47:53Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster than their ODE-based counterparts.
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - GELATO: Geometrically Enriched Latent Model for Offline Reinforcement
Learning [54.291331971813364]
Offline reinforcement learning approaches can be divided into proximal and uncertainty-aware methods.
In this work, we demonstrate the benefit of combining the two in a latent variational model.
Our proposed metrics measure both the quality of out-of-distribution samples and the discrepancy of examples in the data.
arXiv Detail & Related papers (2021-02-22T19:42:40Z) - A Deep Learning Approach for Predicting Spatiotemporal Dynamics From
Sparsely Observed Data [10.217447098102165]
We consider the problem of learning prediction models for physical processes driven by unknown partial differential equations (PDEs).
We propose a deep learning framework that learns the underlying dynamics and predicts its evolution using sparsely distributed data sites.
arXiv Detail & Related papers (2020-11-30T16:38:00Z) - Deep Magnification-Flexible Upsampling over 3D Point Clouds [103.09504572409449]
We propose a novel end-to-end learning-based framework to generate dense point clouds.
We first formulate the problem explicitly, which boils down to determining the weights and high-order approximation errors.
Then, we design a lightweight neural network to adaptively learn unified and sorted weights as well as the high-order refinements.
arXiv Detail & Related papers (2020-11-25T14:00:18Z) - Deep Conditional Transformation Models [0.0]
Learning the cumulative distribution function (CDF) of an outcome variable conditional on a set of features remains challenging.
Conditional transformation models provide a semi-parametric approach that allows modelling of a large class of conditional CDFs.
We propose a novel network architecture, provide details on different model definitions and derive suitable constraints.
arXiv Detail & Related papers (2020-10-15T16:25:45Z)
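The conditional transformation idea in the last entry above can be sketched in a few lines. This is not the paper's network architecture; it is a NumPy illustration of the core monotonicity device commonly used in such models: a Bernstein polynomial h(y|x) whose coefficients are forced to be increasing, so that F(y|x) = Phi(h(y|x)) is a valid CDF in y. The function names and the fixed coefficient vector are our own (in a full model, a network would map features x to these coefficients):

```python
import numpy as np
from math import comb, erf, sqrt

def softplus(z):
    return np.log1p(np.exp(z))

def monotone_coefficients(gamma):
    # theta_0 = gamma_0; theta_k = theta_{k-1} + softplus(gamma_k),
    # so the coefficient sequence is strictly increasing.
    increments = np.concatenate([gamma[:1], softplus(gamma[1:])])
    return np.cumsum(increments)

def bernstein_basis(y, degree):
    # B_{k,M}(y) = C(M, k) * y^k * (1 - y)^(M - k), with y in [0, 1].
    binom = np.array([comb(degree, k) for k in range(degree + 1)])
    k = np.arange(degree + 1)
    return binom * y[:, None] ** k * (1.0 - y[:, None]) ** (degree - k)

def conditional_cdf(y, gamma):
    # F(y | x) = Phi(h(y | x)): increasing Bernstein coefficients make
    # h monotone in y, hence F is a valid CDF slice. The fixed gamma
    # below stands in for the output of a network evaluated at some x.
    theta = monotone_coefficients(gamma)
    h = bernstein_basis(y, len(gamma) - 1) @ theta
    return np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in h])

y = np.linspace(0.0, 1.0, 50)
F = conditional_cdf(y, gamma=np.array([-2.0, 0.5, 0.5, 0.5, 2.0]))
assert np.all(np.diff(F) > 0)  # strictly increasing by construction
```

The softplus-cumsum reparameterisation is what lets an unconstrained network output be turned into a monotone transformation, which is the constraint the DCTM entry refers to.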
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.