Variational Flow Graphical Model
- URL: http://arxiv.org/abs/2207.02722v1
- Date: Wed, 6 Jul 2022 14:51:03 GMT
- Title: Variational Flow Graphical Model
- Authors: Shaogang Ren, Belhal Karimi, Dingcheng Li, Ping Li
- Abstract summary: The Variational Flow Graphical (VFG) Model learns representations of high-dimensional data via a message-passing scheme.
VFGs produce a lower-dimensional representation of the data, thus overcoming the drawbacks of many flow-based models.
In experiments, VFGs achieve improved evidence lower bound (ELBO) and likelihood values on multiple datasets.
- Score: 22.610974083362606
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces a novel approach to embedding flow-based models
with hierarchical structures. The proposed framework is named the Variational
Flow Graphical (VFG) Model. VFGs learn representations of high-dimensional data
via a message-passing scheme, integrating flow-based functions through
variational inference. By leveraging the expressive power of neural networks,
VFGs produce a lower-dimensional representation of the data, thus overcoming a
drawback of many flow-based models, which usually require a high-dimensional
latent space containing many trivial variables. Aggregation nodes are
introduced in VFG models to integrate forward and backward hierarchical
information via a message-passing scheme. Maximizing the evidence lower bound
(ELBO) of the data likelihood aligns the forward and backward messages at each
aggregation node, yielding a consistent node state. Algorithms have been
developed to learn the model parameters through gradient updates on the ELBO
objective.
The consistency of aggregation nodes enables VFGs to perform tractable
inference on graphical structures. Besides representation learning and
numerical inference, VFGs provide a new approach to distribution modeling on
datasets with graphical latent structures. Additionally, theoretical study
shows that VFGs are universal approximators, leveraging their implicitly
invertible flow-based structures. With flexible graphical structures and
superior expressive power, VFGs could potentially be used to improve
probabilistic inference. In the experiments, VFGs achieve improved evidence
lower bound (ELBO) and likelihood values on multiple datasets.
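As a minimal sketch of the flow-plus-variational-inference idea in the abstract, the snippet below estimates an ELBO where the decoder is a single invertible affine flow. This is not the authors' VFG implementation (which uses message passing over aggregation nodes); all function names and the Gaussian reconstruction term are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kl(mu, log_var):
    # KL(N(mu, diag(sigma^2)) || N(0, I)), summed over dimensions.
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

def affine_flow(z, scale, shift):
    # A single invertible affine transform x = scale * z + shift;
    # its log|det Jacobian| is sum(log|scale|) by the change of variables.
    x = scale * z + shift
    log_det = np.sum(np.log(np.abs(scale)))
    return x, log_det

def elbo(x, mu, log_var, scale, shift, n_samples=64):
    # Monte Carlo estimate of E_q[log p(x|z)] - KL(q(z|x) || p(z)),
    # with a Gaussian reconstruction term as a stand-in likelihood.
    total = 0.0
    for _ in range(n_samples):
        eps = rng.standard_normal(mu.shape)
        z = mu + np.exp(0.5 * log_var) * eps      # reparameterization trick
        x_hat, _ = affine_flow(z, scale, shift)
        total += -0.5 * np.sum((x - x_hat) ** 2)  # reconstruction term
    return total / n_samples - gaussian_kl(mu, log_var)
```

A posterior that matches the data (e.g. `mu` near the flow's preimage of `x`) yields a higher ELBO than a mismatched one, which is the signal gradient-based training would follow.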
Related papers
- Scalable Weibull Graph Attention Autoencoder for Modeling Document Networks [50.42343781348247]
We develop a graph Poisson factor analysis (GPFA) which provides analytic conditional posteriors to improve the inference accuracy.
We also extend GPFA to a multi-stochastic-layer version named graph Poisson gamma belief network (GPGBN) to capture the hierarchical document relationships at multiple semantic levels.
Our models can extract high-quality hierarchical latent document representations and achieve promising performance on various graph analytic tasks.
arXiv Detail & Related papers (2024-10-13T02:22:14Z) - GRVFL-MV: Graph Random Vector Functional Link Based on Multi-View Learning [0.2999888908665658]
A novel graph random vector functional link based on multi-view learning (GRVFL-MV) model is proposed.
The proposed model is trained on multiple views, incorporating the concept of multi-view learning (MVL).
It also incorporates the geometrical properties of all the views using the graph embedding (GE) framework.
arXiv Detail & Related papers (2024-09-07T07:18:08Z) - A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
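The GSPT entry above mentions sampling node contexts through random walks; the sketch below shows one plausible way to do that on an adjacency list. It is an illustrative assumption, not the paper's implementation, and `random_walk_contexts` is a hypothetical name.

```python
import random

def random_walk_contexts(adj, start, walk_len, num_walks, seed=0):
    # Sample fixed-length random walks from `start`; each walk is a node
    # context in the spirit of feature-centric pretraining that treats
    # graph structure as a prior. `adj` maps node -> list of neighbors.
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        walk = [start]
        for _ in range(walk_len - 1):
            nbrs = adj[walk[-1]]
            if not nbrs:          # dead end: stop this walk early
                break
            walk.append(rng.choice(nbrs))
        walks.append(walk)
    return walks
```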
arXiv Detail & Related papers (2024-06-19T22:30:08Z) - T1: Scaling Diffusion Probabilistic Fields to High-Resolution on Unified
Visual Modalities [69.16656086708291]
Diffusion Probabilistic Field (DPF) models the distribution of continuous functions defined over metric spaces.
We propose a new model comprising a view-wise sampling algorithm to focus on local structure learning.
The model can be scaled to generate high-resolution data while unifying multiple modalities.
arXiv Detail & Related papers (2023-05-24T03:32:03Z) - Graph Federated Learning for CIoT Devices in Smart Home Applications [23.216140264163535]
We propose a novel Graph Signal Processing (GSP)-inspired aggregation rule based on graph filtering, dubbed "G-Fedfilt".
The proposed aggregator enables a structured flow of information based on the graph's topology.
It is capable of yielding up to 2.41% higher accuracy than FedAvg when testing the generalization of the models.
arXiv Detail & Related papers (2022-12-29T17:57:19Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Relational VAE: A Continuous Latent Variable Model for Graph Structured
Data [0.0]
We show applications on the problem of structured probability density modeling for simulated and real wind farm monitoring data.
We release the source code, along with the simulated datasets.
arXiv Detail & Related papers (2021-06-30T13:24:27Z) - Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
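The FLAG entry above describes iteratively augmenting node features with gradient-based adversarial perturbations. The sketch below shows that loop in isolation, using a caller-supplied gradient function as a stand-in for the model's loss gradient; the function name and step schedule are illustrative assumptions, not FLAG's exact recipe.

```python
import numpy as np

def flag_augment(x, grad_fn, step_size=0.01, n_steps=3):
    # FLAG-style adversarial augmentation: start from a small random
    # perturbation of the node features and take sign-gradient ascent
    # steps on the training loss w.r.t. the input. `grad_fn` returns
    # the gradient of a (hypothetical) model loss at its argument.
    rng = np.random.default_rng(0)
    delta = rng.uniform(-step_size, step_size, size=x.shape)
    for _ in range(n_steps):
        g = grad_fn(x + delta)
        delta = delta + step_size * np.sign(g)  # ascend the loss
    return x + delta
```

During training, the perturbed features `flag_augment(x, grad_fn)` replace `x` for the gradient step, so the model sees harder inputs without any extra labeled data.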
arXiv Detail & Related papers (2020-10-19T21:51:47Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.