Relational VAE: A Continuous Latent Variable Model for Graph Structured
Data
- URL: http://arxiv.org/abs/2106.16049v1
- Date: Wed, 30 Jun 2021 13:24:27 GMT
- Title: Relational VAE: A Continuous Latent Variable Model for Graph Structured
Data
- Authors: Charilaos Mylonas, Imad Abdallah and Eleni Chatzi
- Abstract summary: We show applications on the problem of structured probability density modeling for simulated and real wind farm monitoring data.
We release the source code, along with the simulated datasets.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Networks (GNs) enable the fusion of prior knowledge and relational
reasoning with flexible function approximations. In this work, a general
GN-based model is proposed which takes full advantage of the relational
modeling capabilities of GNs and extends these to probabilistic modeling with
Variational Bayes (VB). To that end, we combine complementary pre-existing
approaches on VB for graph data and propose an approach that relies on
graph-structured latent and conditioning variables. It is demonstrated that
Neural Processes can also be viewed through the lens of the proposed model. We
show applications on the problem of structured probability density modeling for
simulated and real wind farm monitoring data, as well as on the meta-learning
of simulated Gaussian Process data. We release the source code, along with the
simulated datasets.
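As a reading aid, below is a minimal, hypothetical sketch of the core idea described in the abstract: a VAE whose latent and conditioning variables are attached to the nodes of a graph, with message-passing graph-network blocks as encoder and decoder, trained with a standard negative ELBO. The GNBlock design, layer widths, sum aggregation, and Gaussian likelihood are illustrative assumptions, not the authors' released implementation (see their source code for that).

```python
# Minimal sketch (not the authors' code) of a relational VAE with
# graph-structured latent and conditioning variables.
import torch
import torch.nn as nn

class GNBlock(nn.Module):
    """One round of message passing: compute edge messages, sum-aggregate, update nodes."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(2 * in_dim, out_dim), nn.ReLU())
        self.upd = nn.Sequential(nn.Linear(in_dim + out_dim, out_dim), nn.ReLU())

    def forward(self, x, edge_index):
        src, dst = edge_index                                   # sender / receiver node ids
        m = self.msg(torch.cat([x[src], x[dst]], dim=-1))       # one message per edge
        agg = torch.zeros(x.size(0), m.size(-1))
        agg.index_add_(0, dst, m)                               # sum incoming messages per node
        return self.upd(torch.cat([x, agg], dim=-1))

class RelationalVAE(nn.Module):
    """Per-node Gaussian latents q(z_i | graph); decoder conditioned on the same graph."""
    def __init__(self, x_dim, c_dim, h_dim=32, z_dim=8):
        super().__init__()
        self.enc = GNBlock(x_dim + c_dim, h_dim)
        self.to_mu, self.to_logvar = nn.Linear(h_dim, z_dim), nn.Linear(h_dim, z_dim)
        self.dec = GNBlock(z_dim + c_dim, h_dim)
        self.to_x = nn.Linear(h_dim, x_dim)

    def forward(self, x, cond, edge_index):
        h = self.enc(torch.cat([x, cond], dim=-1), edge_index)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)    # reparameterization trick
        x_hat = self.to_x(self.dec(torch.cat([z, cond], dim=-1), edge_index))
        recon = ((x_hat - x) ** 2).sum()                            # Gaussian log-lik. up to a constant
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum()   # KL(q(z|x) || N(0, I))
        return recon + kl                                           # negative ELBO

# Toy usage: 4 nodes on a ring, 3-d node signals, 2-d graph-structured conditioning variables.
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
x, cond = torch.randn(4, 3), torch.randn(4, 2)
loss = RelationalVAE(x_dim=3, c_dim=2)(x, cond, edge_index)
loss.backward()
```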
Related papers
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Since these models are approximations of reality, it is desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z)
- A Priori Uncertainty Quantification of Reacting Turbulence Closure Models using Bayesian Neural Networks [0.0]
We employ Bayesian neural networks to capture uncertainties in a reacting flow model.
We demonstrate that BNN models can provide unique insights into the uncertainty structure of data-driven closure models.
The efficacy of the model is demonstrated by a priori evaluation on a dataset consisting of a variety of flame conditions and fuels.
arXiv Detail & Related papers (2024-02-28T22:19:55Z)
- Cyclic Directed Probabilistic Graphical Model: A Proposal Based on Structured Outcomes [0.0]
We describe a probabilistic graphical model - probabilistic relation network - that allows the direct capture of directional cyclic dependencies.
This model does not violate the probability axioms, and it supports learning from observed data.
Notably, it supports probabilistic inference, making it a promising tool for data analysis and for expert and decision-making applications.
arXiv Detail & Related papers (2023-10-25T10:19:03Z)
- T1: Scaling Diffusion Probabilistic Fields to High-Resolution on Unified Visual Modalities [69.16656086708291]
Diffusion Probabilistic Field (DPF) models the distribution of continuous functions defined over metric spaces.
We propose a new model comprising a view-wise sampling algorithm that focuses on local structure learning.
The model can be scaled to generate high-resolution data while unifying multiple modalities.
arXiv Detail & Related papers (2023-05-24T03:32:03Z)
- Distributed Bayesian Learning of Dynamic States [65.7870637855531]
The proposed algorithm performs distributed Bayesian filtering for finite-state hidden Markov models.
It can be used for sequential state estimation, as well as for modeling opinion formation over social networks in dynamic environments.
arXiv Detail & Related papers (2022-12-05T19:40:17Z)
- Neural Graphical Models [2.6842860806280058]
We introduce Neural Graphical Models (NGMs) to represent complex feature dependencies with reasonable computational costs.
We capture the dependency structure between the features along with their complex function representations by using a neural network as a multi-task learning framework.
NGMs can fit generic graph structures including directed, undirected and mixed-edge graphs as well as support mixed input data types.
arXiv Detail & Related papers (2022-10-02T07:59:51Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), has been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive evidence lower bounds (ELBOs) for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Functional Mixtures-of-Experts [0.24578723416255746]
We consider the statistical analysis of heterogeneous data for prediction in situations where the observations include functions.
We first present a new family of ME models, named functional ME (FME), in which the predictors are potentially noisy observations.
We develop dedicated expectation-maximization algorithms for Lasso-like (EM-Lasso) regularized maximum-likelihood parameter estimation to fit the models.
arXiv Detail & Related papers (2022-02-04T17:32:28Z)
- GNisi: A graph network for reconstructing Ising models from multivariate binarized data [0.0]
We present a novel method for the determination of Ising parameters from data, called GNisi, which uses a Graph Neural network trained on known Ising models.
We show that GNisi is more accurate than existing state-of-the-art software, and we illustrate the method by applying GNisi to gene expression data.
arXiv Detail & Related papers (2021-09-09T13:27:40Z)
- Bayesian Sparse Factor Analysis with Kernelized Observations [67.60224656603823]
Multi-view problems can be addressed with latent variable models.
High dimensionality and non-linearity are traditionally handled by kernel methods.
We propose merging both approaches into a single model.
arXiv Detail & Related papers (2020-06-01T14:25:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.