Struct-MMSB: Mixed Membership Stochastic Blockmodels with Interpretable
Structured Priors
- URL: http://arxiv.org/abs/2002.09523v1
- Date: Fri, 21 Feb 2020 19:32:32 GMT
- Title: Struct-MMSB: Mixed Membership Stochastic Blockmodels with Interpretable
Structured Priors
- Authors: Yue Zhang, Arti Ramesh
- Abstract summary: Mixed membership blockmodel (MMSB) is a popular framework for community detection and network generation.
We present a flexible MMSB model, Struct-MMSB, that uses a recently developed statistical relational learning model, hinge-loss Markov random fields (HL-MRFs), as a structured prior.
Our model is capable of learning latent characteristics in real-world networks via meaningful latent variables encoded as a complex combination of observed features and membership distributions.
- Score: 13.712395104755783
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The mixed membership stochastic blockmodel (MMSB) is a popular framework for
community detection and network generation. It learns a low-rank mixed
membership representation for each node across communities by exploiting the
underlying graph structure. MMSB assumes that the membership distributions of
the nodes are independently drawn from a Dirichlet distribution, which limits
its capability to model highly correlated graph structures that exist in
real-world networks. In this paper, we present a flexible richly structured
MMSB model, \textit{Struct-MMSB}, that uses a recently developed statistical
relational learning model, hinge-loss Markov random fields (HL-MRFs), as a
structured prior to model complex dependencies among node attributes,
multi-relational links, and their relationship with mixed-membership
distributions. Our model is specified using a probabilistic programming
templating language that uses weighted first-order logic rules, which enhances
the model's interpretability. Further, our model is capable of learning latent
characteristics in real-world networks via meaningful latent variables encoded
as a complex combination of observed features and membership distributions. We
present an expectation-maximization based inference algorithm that learns
latent variables and parameters iteratively, a scalable stochastic variation of
the inference algorithm, and a method to learn the weights of HL-MRF structured
priors. We evaluate our model on six datasets across three different types of
networks and corresponding modeling scenarios and demonstrate that our models
are able to achieve an improvement of 15\% on average in test log-likelihood
and faster convergence when compared to state-of-the-art network models.
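The independence assumption criticized in the abstract is easiest to see in the standard MMSB generative process: each node's membership vector is drawn independently from a Dirichlet, and each directed edge is sampled from a block compatibility matrix. The sketch below illustrates that baseline process only (it is not the Struct-MMSB model itself, whose HL-MRF structured prior replaces the independent Dirichlet draws); function and variable names are illustrative.

```python
import numpy as np

def sample_mmsb(n_nodes, n_communities, alpha, B, rng=None):
    """Sample a directed graph from the standard MMSB generative process.

    alpha: Dirichlet concentration (scalar or length-K vector)
    B:     K x K block compatibility matrix of edge probabilities
    """
    rng = rng or np.random.default_rng(0)
    K = n_communities
    alpha = np.full(K, alpha) if np.isscalar(alpha) else np.asarray(alpha)
    # Each node's mixed-membership vector is drawn independently from
    # Dirichlet(alpha) -- the assumption Struct-MMSB's structured prior relaxes.
    pi = rng.dirichlet(alpha, size=n_nodes)            # shape (N, K)
    A = np.zeros((n_nodes, n_nodes), dtype=int)
    for p in range(n_nodes):
        for q in range(n_nodes):
            if p == q:
                continue
            z_pq = rng.choice(K, p=pi[p])              # sender's community for this pair
            z_qp = rng.choice(K, p=pi[q])              # receiver's community
            A[p, q] = rng.binomial(1, B[z_pq, z_qp])   # edge from the block matrix
    return pi, A

B = 0.02 + 0.9 * np.eye(3)        # assortative: dense within-community blocks
pi, A = sample_mmsb(30, 3, 0.1, B)
```

A small concentration (here `alpha = 0.1`) pushes each membership vector toward a single community, recovering near-hard community structure; Struct-MMSB instead couples these vectors through weighted first-order logic rules over node attributes and links.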
Related papers
- Federated Learning Aggregation: New Robust Algorithms with Guarantees [63.96013144017572]
Federated learning has been recently proposed for distributed model training at the edge.
This paper presents a complete general mathematical convergence analysis to evaluate aggregation strategies in a federated learning framework.
We derive novel aggregation algorithms which are able to modify their model architecture by differentiating client contributions according to the value of their losses.
arXiv Detail & Related papers (2022-05-22T16:37:53Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Low-Rank Constraints for Fast Inference in Structured Models [110.38427965904266]
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
Experiments with neural parameterized structured models for language modeling, polyphonic music modeling, unsupervised grammar induction, and video modeling show that our approach matches the accuracy of standard models at large state spaces.
arXiv Detail & Related papers (2022-01-08T00:47:50Z)
- Multi-Scale Semantics-Guided Neural Networks for Efficient Skeleton-Based Human Action Recognition [140.18376685167857]
A simple yet effective multi-scale semantics-guided neural network is proposed for skeleton-based action recognition.
MS-SGN achieves the state-of-the-art performance on the NTU60, NTU120, and SYSU datasets.
arXiv Detail & Related papers (2021-11-07T03:50:50Z)
- Scalable Bayesian Network Structure Learning with Splines [2.741266294612776]
A Bayesian Network (BN) is a probabilistic graphical model consisting of a directed acyclic graph (DAG)
We present a novel approach capable of learning the global DAG structure of a BN and modelling linear and non-linear local relationships between variables.
arXiv Detail & Related papers (2021-10-27T17:54:53Z)
- deepregression: a Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression [1.4909973741292273]
deepregression is implemented in both R and Python, using the deep learning libraries TensorFlow and PyTorch, respectively.
arXiv Detail & Related papers (2021-04-06T17:56:31Z)
- Edge-assisted Democratized Learning Towards Federated Analytics [67.44078999945722]
We show the hierarchical learning structure of the proposed edge-assisted democratized learning mechanism, namely Edge-DemLearn.
We also validate Edge-DemLearn as a flexible model training mechanism to build a distributed control and aggregation methodology in regions.
arXiv Detail & Related papers (2020-12-01T11:46:03Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
- Consistency of Spectral Clustering on Hierarchical Stochastic Block Models [5.983753938303726]
We study the hierarchy of communities in real-world networks under a generic block model.
We prove the strong consistency of this method under a wide range of model parameters.
Unlike most existing work, our theory covers multiscale networks where the connection probabilities may differ by orders of magnitude.
arXiv Detail & Related papers (2020-04-30T01:08:59Z)
- Semi-Structured Distributional Regression -- Extending Structured Additive Models by Arbitrary Deep Neural Networks and Data Modalities [0.0]
We propose a general framework to combine structured regression models and deep neural networks into a unifying network architecture.
We demonstrate the framework's efficacy in numerical experiments and illustrate its special merits in benchmarks and real-world applications.
arXiv Detail & Related papers (2020-02-13T21:01:26Z)
- Fragmentation Coagulation Based Mixed Membership Stochastic Blockmodel [17.35449041036449]
The Mixed-Membership Stochastic Blockmodel (MMSB) is one of the state-of-the-art Bayesian methods for learning the complex hidden structure underlying network data.
Our model performs entity-based clustering to capture the community information for entities and linkage-based clustering to derive the group information for links simultaneously.
By integrating the community structure with the group compatibility matrix we derive a generalized version of MMSB.
arXiv Detail & Related papers (2020-01-17T22:02:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.