Nondiagonal Mixture of Dirichlet Network Distributions for Analyzing a Stock Ownership Network
- URL: http://arxiv.org/abs/2009.04446v2
- Date: Sun, 1 Nov 2020 12:10:41 GMT
- Title: Nondiagonal Mixture of Dirichlet Network Distributions for Analyzing a Stock Ownership Network
- Authors: Wenning Zhang, Ryohei Hisano, Takaaki Ohnishi, Takayuki Mizuno
- Abstract summary: Block modeling is widely used in studies on complex networks.
We provide an edge exchangeable block model that incorporates basic features of complex networks, such as sparsity and a heavy-tailed degree distribution, and simultaneously infers the latent block structure of a given complex network.
Our model is a Bayesian nonparametric model that flexibly estimates the number of blocks and takes into account the possibility of unseen nodes.
- Score: 3.0969191504482243
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Block modeling is widely used in studies on complex networks. The cornerstone model is the stochastic block model (SBM), which has seen extensive use over the past decades.
However, the SBM is limited in analyzing complex networks as the model is, in
essence, a random graph model that cannot reproduce the basic properties of
many complex networks, such as sparsity and heavy-tailed degree distribution.
In this paper, we provide an edge exchangeable block model that incorporates
such basic features and simultaneously infers the latent block structure of a
given complex network. Our model is a Bayesian nonparametric model that
flexibly estimates the number of blocks and takes into account the possibility
of unseen nodes. Using one synthetic dataset and one real-world stock ownership
dataset, we show that our model outperforms state-of-the-art SBMs for held-out
link prediction tasks.
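As an illustration of the edge exchangeable idea described above, here is a minimal Python sketch (the function names and the exact urn scheme are assumptions, not the authors' code): edges, rather than nodes, are the atomic units; endpoints are drawn from a CRP-style urn within each block, so previously unseen nodes always keep positive probability; and every block pair, including nondiagonal (between-block) pairs, carries its own mixture weight.

```python
# Illustrative sketch (not the authors' code) of an edge exchangeable
# block model. Each edge first picks a (sender block, receiver block)
# pair, then draws each endpoint from a CRP-style urn: existing nodes
# are chosen in proportion to their past degree, and with weight alpha
# a brand-new, previously unseen node is created.
import numpy as np

def sample_edges(n_edges, n_blocks=3, alpha=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Mixture weights over all block pairs, including nondiagonal ones.
    pair_w = rng.dirichlet(np.ones(n_blocks * n_blocks))
    counts = [dict() for _ in range(n_blocks)]  # per-block node degrees
    next_id = 0
    edges = []

    def draw_node(b):
        nonlocal next_id
        nodes = list(counts[b])
        w = np.array([counts[b][v] for v in nodes] + [alpha], dtype=float)
        i = rng.choice(len(w), p=w / w.sum())
        if i == len(nodes):                 # brand-new, unseen node
            v, next_id = next_id, next_id + 1
        else:
            v = nodes[i]
        counts[b][v] = counts[b].get(v, 0) + 1
        return v

    for _ in range(n_edges):
        k = rng.choice(n_blocks * n_blocks, p=pair_w)
        b_s, b_r = divmod(k, n_blocks)      # sender block, receiver block
        edges.append((draw_node(b_s), draw_node(b_r)))
    return edges

edges = sample_edges(5000)
deg = np.bincount(np.array(edges).ravel())
print(f"{len(deg)} nodes, max degree {deg.max()}")  # sparse, heavy-tailed
```

Because endpoints are reinforced in proportion to their past degree, the generated graphs come out sparse with heavy-tailed degree distributions, the two properties the abstract says a plain SBM cannot reproduce. The paper's held-out link prediction scores unseen edges under the inferred posterior, which the sketch does not implement; the sketch also ties each node to a single block, a simplification of the actual model.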
Related papers
- SpaceMesh: A Continuous Representation for Learning Manifold Surface Meshes [61.110517195874074]
We present a scheme to directly generate manifold, polygonal meshes of complex connectivity as the output of a neural network.
Our key innovation is to define a continuous latent connectivity space at each mesh vertex, which implies the discrete mesh.
In applications, this approach not only yields high-quality outputs from generative models, but also enables directly learning challenging geometry processing tasks such as mesh repair.
arXiv Detail & Related papers (2024-09-30T17:59:03Z)
- Vertical Layering of Quantized Neural Networks for Heterogeneous Inference [57.42762335081385]
We study a new vertical-layered representation of neural network weights for encapsulating all quantized models into a single one.
In theory, we can obtain a network of any precision for on-demand service while needing to train and maintain only one model.
arXiv Detail & Related papers (2022-12-10T15:57:38Z)
- Parameter-Efficient Masking Networks [61.43995077575439]
Advanced network designs often contain a large number of repetitive structures (e.g., Transformer).
In this study, we are the first to investigate the representative potential of fixed random weights with limited unique values by learning masks.
This leads to a new model compression paradigm that reduces model size.
arXiv Detail & Related papers (2022-10-13T03:39:03Z)
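A minimal sketch of the learned-mask idea in the Parameter-Efficient Masking Networks entry above, assuming a straight-through estimator for the binary mask (MaskedLinear and its fields are illustrative names, not the paper's API):

```python
# Illustrative sketch (not the paper's code): a linear layer whose random
# weights are frozen and never trained; only a binary mask over those
# weights is learned, here via a straight-through estimator. Storing a
# random seed plus a 1-bit mask per weight is what shrinks the model.
import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        # Fixed random weights, registered as a buffer so they are saved
        # with the model but excluded from gradient updates.
        self.register_buffer("weight", torch.randn(d_out, d_in))
        # Real-valued mask scores; thresholding them yields the binary mask.
        self.score = nn.Parameter(torch.randn(d_out, d_in))

    def forward(self, x):
        hard = (self.score > 0).float()
        # Straight-through trick: the forward pass uses the hard 0/1 mask,
        # but gradients flow into the real-valued scores.
        mask = hard + self.score - self.score.detach()
        return x @ (self.weight * mask).t()

layer = MaskedLinear(8, 4)
print(layer(torch.randn(2, 8)).shape)  # torch.Size([2, 4])
```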
- The Multivariate Community Hawkes Model for Dependent Relational Events in Continuous-time Networks [3.55528500800612]
The stochastic block model (SBM) is one of the most widely used generative models for network data.
We propose the multivariate community Hawkes (MULCH) model, an extremely flexible community-based model for continuous-time networks.
We find that our proposed MULCH model is far more accurate than existing models for both predictive and generative tasks.
arXiv Detail & Related papers (2022-05-02T04:08:44Z)
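A minimal sketch of the community Hawkes idea behind the MULCH entry above (the exponential kernel and parameter names are assumptions, not the paper's exact parameterization): the event rate on a directed node pair depends on the blocks of its endpoints and is excited by past events on the pair.

```python
# Illustrative sketch (not the MULCH code): a block-structured Hawkes
# intensity for a directed pair (u, v). The baseline and excitation
# strength depend only on the endpoints' blocks; each past event on the
# pair raises the rate, which then decays exponentially.
import numpy as np

def pair_intensity(t, history, b_u, b_v, mu, alpha, beta):
    """Event rate on (u, v) at time t, given past event times on (u, v)."""
    past = history[history < t]
    excitation = alpha[b_u, b_v] * beta * np.exp(-beta * (t - past)).sum()
    return mu[b_u, b_v] + excitation

# Two blocks; within-block pairs get a higher baseline and excitation.
mu = np.array([[0.5, 0.1], [0.1, 0.5]])
alpha = np.array([[0.8, 0.2], [0.2, 0.8]])
history = np.array([0.4, 1.1, 1.3])  # past events on this pair
print(pair_intensity(2.0, history, 0, 0, mu, alpha, beta=1.5))
```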
- Multiblock-Networks: A Neural Network Analog to Component Based Methods for Multi-Source Data [0.0]
We propose a setup to transfer the concepts of component based statistical models to neural network architectures.
Thereby, we combine the flexibility of neural networks with the concepts for interpreting block relevance in multiblock methods.
Our results underline that multiblock networks allow for basic model interpretation while matching the performance of ordinary feed-forward neural networks.
arXiv Detail & Related papers (2021-09-21T16:00:15Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Tractably Modelling Dependence in Networks Beyond Exchangeability [0.0]
We study the estimation, clustering and degree behavior of the network in our setting.
This explains why, and under which general conditions, non-exchangeable network data can be described by a block model.
arXiv Detail & Related papers (2020-07-28T17:13:59Z)
- Consistency of Spectral Clustering on Hierarchical Stochastic Block Models [5.983753938303726]
We study the hierarchy of communities in real-world networks under a generic block model.
We prove the strong consistency of this method under a wide range of model parameters.
Unlike most existing work, our theory covers multiscale networks where the connection probabilities may differ by orders of magnitude.
arXiv Detail & Related papers (2020-04-30T01:08:59Z)
- Belief Propagation Reloaded: Learning BP-Layers for Labeling Problems [83.98774574197613]
We take one of the simplest inference methods, truncated max-product Belief Propagation, and add what is necessary to make it a proper component of a deep learning model.
This BP-Layer can be used as the final or an intermediate block in convolutional neural networks (CNNs).
The model is applicable to a range of dense prediction problems, is well-trainable and provides parameter-efficient and robust solutions in stereo, optical flow and semantic segmentation.
arXiv Detail & Related papers (2020-03-13T13:11:35Z)
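A minimal sketch of the truncated max-product idea behind the BP-Layers entry above (min-sum over costs equals max-product in the log domain; the 1D chain setting and all names are illustrative, not the paper's implementation):

```python
# Illustrative sketch (not the paper's BP-Layer): one min-sum message-
# passing sweep along a 1D chain with K labels, the kind of scanline
# update a BP layer can stack inside a CNN. Truncation means running a
# small, fixed number of such sweeps instead of iterating to convergence.
import numpy as np

def chain_minsum(unary, pairwise):
    """unary: (T, K) per-position label costs; pairwise: (K, K) transition costs."""
    T, K = unary.shape
    msg = np.zeros((T, K))  # messages passed left-to-right
    for t in range(1, T):
        prev = unary[t - 1] + msg[t - 1]           # cost up to position t-1
        msg[t] = (prev[:, None] + pairwise).min(axis=0)
    beliefs = unary + msg                           # beliefs after one sweep
    return beliefs.argmin(axis=1)                   # per-position labeling

unary = np.random.rand(6, 3)          # e.g., 6 pixels, 3 labels
pairwise = 0.5 * (1 - np.eye(3))      # Potts-style smoothness cost
print(chain_minsum(unary, pairwise))
```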
- Struct-MMSB: Mixed Membership Stochastic Blockmodels with Interpretable Structured Priors [13.712395104755783]
The mixed membership stochastic blockmodel (MMSB) is a popular framework for community detection and network generation.
We present a flexible MMSB model, Struct-MMSB, that uses a recently developed statistical relational learning model, hinge-loss Markov random fields (HL-MRFs), as structured priors.
Our model is capable of learning latent characteristics in real-world networks via meaningful latent variables encoded as a complex combination of observed features and membership distributions.
arXiv Detail & Related papers (2020-02-21T19:32:32Z)
- Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)