$\mathcal{F}$-EBM: Energy Based Learning of Functional Data
- URL: http://arxiv.org/abs/2202.01929v1
- Date: Fri, 4 Feb 2022 01:01:50 GMT
- Title: $\mathcal{F}$-EBM: Energy Based Learning of Functional Data
- Authors: Jen Ning Lim, Sebastian Vollmer, Lorenz Wolf, Andrew Duncan
- Abstract summary: Energy-Based Models (EBMs) have proven to be a highly effective approach for modelling densities on finite-dimensional spaces.
We present a novel class of EBMs that can learn distributions of functions from functional samples evaluated at finitely many points.
- Score: 1.0896567381206714
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Energy-Based Models (EBMs) have proven to be a highly effective approach for
modelling densities on finite-dimensional spaces. Their ability to incorporate
domain-specific choices and constraints into the structure of the model through
composition makes EBMs an appealing candidate for applications in physics,
biology, computer vision, and various other fields. In this work, we present
a novel class of EBMs that can learn distributions of functions (such as
curves or surfaces) from functional samples evaluated at finitely many points.
Two unique challenges arise in the functional context. Firstly, training data
is often not evaluated along a fixed set of points. Secondly, steps must be
taken to control the behaviour of the model between evaluation points, to
mitigate overfitting. The proposed infinite-dimensional EBM employs a latent
Gaussian process, which is weighted spectrally by an energy function
parameterised with a neural network. The resulting EBM can utilize
irregularly sampled training data and can output predictions at any
resolution, providing an effective approach to up-scaling functional data. We
demonstrate the efficacy of our proposed approach for modelling a range of
datasets, including data collected from the Standard and Poor's 500 index
(S&P 500) and the UK National Grid.
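As a rough intuition for this construction, the minimal sketch below projects a functional sample observed at irregular points onto a finite spectral basis and scores the coefficients with a neural energy. It is a sketch under assumptions: the cosine basis, the toy MLP, and all names are illustrative, not the authors' parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_coefficients(x, y, n_modes=8):
    """Least-squares projection of irregularly sampled values y(x_i)
    onto a cosine basis -- a crude stand-in for the spectral
    (Karhunen-Loeve) expansion of a latent Gaussian process."""
    basis = np.stack([np.cos(np.pi * k * x) for k in range(n_modes)], axis=1)
    coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)
    return coeffs

def energy(coeffs, W1, b1, w2):
    """Toy MLP energy on the spectral coefficients (weights are assumptions)."""
    h = np.tanh(coeffs @ W1 + b1)
    return float(h @ w2)

# One functional sample observed at irregular locations in [0, 1].
x = np.sort(rng.uniform(size=30))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)

n_modes, width = 8, 16
W1 = rng.standard_normal((n_modes, width)) / np.sqrt(n_modes)
b1 = np.zeros(width)
w2 = rng.standard_normal(width)

print("energy:", energy(spectral_coefficients(x, y, n_modes), W1, b1, w2))
```

Because the energy acts on spectral coefficients rather than on a fixed grid, the same model can score samples observed at different resolutions, which is what makes irregular sampling and up-scaling possible.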
Related papers
- LSEBMCL: A Latent Space Energy-Based Model for Continual Learning [20.356926275395004]
The proposed solution, LSEBMCL (Latent Space Energy-Based Model for Continual Learning), uses energy-based models (EBMs) to prevent catastrophic forgetting.
The study demonstrates the efficacy of EBMs in NLP tasks, achieving state-of-the-art results in all experiments.
arXiv Detail & Related papers (2025-01-09T15:47:30Z)
- Energy-Based Modelling for Discrete and Mixed Data via Heat Equations on Structured Spaces [19.92604781654767]
Energy-based models (EBMs) offer a flexible framework for probabilistic modelling across various data domains.
We propose to train discrete EBMs with Energy Discrepancy, a loss function which only requires the evaluation of the energy function at data points.
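As a loose illustration of a sampler-free contrastive loss of this kind (the paper's energy discrepancy uses heat-equation-based perturbations on structured spaces; the bit-flip perturbation, stabilisation constant, and all names below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy_fn(x, w):
    """Toy quadratic energy on binary vectors (w is an assumption)."""
    return x @ w @ x.T if x.ndim == 1 else np.einsum('bi,ij,bj->b', x, w, x)

def sampler_free_contrastive_loss(x, w, n_pert=8, flip_prob=0.1, stab=1.0):
    """Per-sample loss log(stab/M + mean_j exp(E(x) - E(y_j))), where the
    y_j are bit-flip perturbations of x. Only energy evaluations at
    points are needed; no MCMC sampling from the model."""
    losses = []
    for xi in x:
        flips = rng.random((n_pert, xi.size)) < flip_prob
        ys = np.where(flips, 1 - xi, xi)              # perturbed points
        diff = energy_fn(xi, w) - energy_fn(ys, w)    # E(x) - E(y_j)
        losses.append(np.log(stab / n_pert + np.exp(diff).mean()))
    return float(np.mean(losses))

x = rng.integers(0, 2, size=(16, 10)).astype(float)
w = rng.standard_normal((10, 10)) * 0.1
print("loss:", sampler_free_contrastive_loss(x, w))
```

Minimising this drives the energy of data points below that of their perturbations, without ever sampling from the model itself.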
arXiv Detail & Related papers (2024-12-02T00:35:29Z)
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- CoCoGen: Physically-Consistent and Conditioned Score-based Generative Models for Forward and Inverse Problems [1.0923877073891446]
This work extends the reach of generative models into physical problem domains.
We present an efficient approach to promote consistency with the underlying PDE.
We showcase the potential and versatility of score-based generative models in various physics tasks.
arXiv Detail & Related papers (2023-12-16T19:56:10Z)
- Energy-Based Models for Anomaly Detection: A Manifold Diffusion Recovery Approach [12.623417770432146]
We present a new method of training energy-based models (EBMs) for anomaly detection that leverages low-dimensional structures within data.
The proposed algorithm, Manifold Projection-Diffusion Recovery (MPDR), first perturbs a data point along a low-dimensional manifold that approximates the training dataset.
Experimental results show that MPDR exhibits strong performance across various anomaly detection tasks involving diverse data types.
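A minimal sketch of the projection-diffusion step, with a PCA-style linear map standing in for the learned autoencoder (everything below is an illustrative assumption, not the MPDR architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data lying near a 3-dimensional linear subspace of R^10.
X = rng.standard_normal((500, 3)) @ rng.standard_normal((3, 10))
mu = X.mean(axis=0)
U = np.linalg.svd(X - mu, full_matrices=False)[2][:3]  # top-3 directions

encode = lambda x: (x - mu) @ U.T       # project onto the manifold
decode = lambda z: z @ U + mu           # map back to data space

def manifold_perturb(x, noise_scale=0.5):
    """Projection-diffusion step: diffuse in the low-dimensional
    latent space, then decode back to the ambient space."""
    z = encode(x) + noise_scale * rng.standard_normal(3)
    return decode(z)

x = X[0]
x_pert = manifold_perturb(x)
# The EBM is then trained so that x is recovered from x_pert,
# i.e. x receives lower energy than off-manifold alternatives.
print(np.linalg.norm(x - x_pert))
```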
arXiv Detail & Related papers (2023-10-28T11:18:39Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
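For intuition about the probabilistic representation, the sketch below (an illustration, not the paper's solver) estimates the 1-D heat equation by averaging an ensemble of random particles via the Feynman-Kac formula:

```python
import numpy as np

def heat_solution_mc(u0, x, t, kappa=1.0, n_particles=100_000, rng=None):
    """Monte Carlo / Feynman-Kac estimate for u_t = kappa * u_xx:
    u(x, t) = E[u0(x + sqrt(2*kappa*t) * Z)] with Z ~ N(0, 1)."""
    rng = rng or np.random.default_rng(0)
    z = rng.standard_normal(n_particles)
    return u0(x + np.sqrt(2.0 * kappa * t) * z).mean()

# Gaussian initial condition, for which the exact solution is known:
# u(x, t) = exp(-x^2 / (1 + 4*kappa*t)) / sqrt(1 + 4*kappa*t).
u0 = lambda x: np.exp(-x ** 2)
x, t = 0.5, 0.1
exact = np.exp(-x ** 2 / (1 + 4 * t)) / np.sqrt(1 + 4 * t)
print(heat_solution_mc(u0, x, t), "vs exact", exact)
```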
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive evidence lower bounds (ELBOs) for ME-NODE and develop efficient training algorithms.
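As a hedged illustration of what fixed + random mixed effects mean inside an ODE (the toy linear dynamics and all names below are assumptions, not the ME-NODE model), each subject shares a population-level parameter but integrates dynamics perturbed by its own random effect:

```python
import numpy as np

rng = np.random.default_rng(0)

def drift(y, theta):
    """Toy linear dynamics; theta bundles fixed + random effects."""
    return -theta * y

def simulate(y0, theta, t_grid):
    """Forward-Euler integration of dy/dt = drift(y, theta)."""
    ys = [y0]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        ys.append(ys[-1] + (t1 - t0) * drift(ys[-1], theta))
    return np.array(ys)

theta_fixed = 1.0                          # population-level (fixed) effect
t_grid = np.linspace(0.0, 2.0, 50)
for subject in range(3):
    b_i = 0.3 * rng.standard_normal()      # subject-level (random) effect
    traj = simulate(1.0, theta_fixed + b_i, t_grid)
    print(f"subject {subject}: y(2.0) = {traj[-1]:.3f}")
```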
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Towards Open-World Feature Extrapolation: An Inductive Graph Learning Approach [80.8446673089281]
We propose a new learning paradigm with graph representation and learning.
Our framework contains two modules: 1) a backbone network (e.g., feedforward neural nets) as a lower model takes features as input and outputs predicted labels; 2) a graph neural network as an upper model learns to extrapolate embeddings for new features via message passing over a feature-data graph built from observed data.
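A minimal stand-in for this message-passing scheme (two rounds of aggregation over a toy feature-data bipartite graph; the paper's GNN is more elaborate, and every name below is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature-data bipartite graph: rows are data points, columns are
# features; edge weights are the observed feature values.
X_obs = rng.standard_normal((100, 5))        # 100 points, 5 known features
feat_emb = rng.standard_normal((5, 16))      # embeddings of known features

# Pass 1: each data point aggregates the embeddings of its features.
data_emb = (X_obs @ feat_emb) / X_obs.shape[1]

# A brand-new feature column arrives at test time ...
x_new = rng.standard_normal((100, 1))
# ... Pass 2: its embedding is extrapolated by aggregating back from
# the data-point side of the graph, with no retraining of the backbone.
new_feat_emb = (x_new.T @ data_emb) / x_new.shape[0]
print(new_feat_emb.shape)                    # (1, 16)
```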
arXiv Detail & Related papers (2021-10-09T09:02:45Z)
- Energy-Efficient and Federated Meta-Learning via Projected Stochastic Gradient Ascent [79.58680275615752]
We propose an energy-efficient federated meta-learning framework.
We assume each task is owned by a separate agent, so only a limited number of tasks is available to train a meta-model.
arXiv Detail & Related papers (2021-05-31T08:15:44Z)
- On Energy-Based Models with Overparametrized Shallow Neural Networks [44.74000986284978]
Energy-based models (EBMs) are a powerful framework for generative modeling.
In this work we focus on shallow neural networks.
We show that models trained in the so-called "active" regime provide a statistical advantage over their associated "lazy" or kernel regime.
arXiv Detail & Related papers (2021-04-15T15:34:58Z)
- Learning Discrete Energy-based Models via Auxiliary-variable Local Exploration [130.89746032163106]
We propose ALOE, a new algorithm for learning conditional and unconditional EBMs for discrete structured data.
We show that the energy function and sampler can be trained efficiently via a new variational form of power iteration.
We also present an energy-model-guided fuzzer for software testing that achieves performance comparable to well-engineered fuzzing engines such as libFuzzer.
arXiv Detail & Related papers (2020-11-10T19:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.