Generalized Precision Matrix for Scalable Estimation of Nonparametric Markov Networks
- URL: http://arxiv.org/abs/2305.11379v1
- Date: Fri, 19 May 2023 01:53:10 GMT
- Title: Generalized Precision Matrix for Scalable Estimation of Nonparametric Markov Networks
- Authors: Yujia Zheng, Ignavier Ng, Yewen Fan, Kun Zhang
- Abstract summary: A Markov network characterizes the conditional independence structure, or Markov property, among a set of random variables.
In this work, we characterize the conditional independence structure in general distributions for all data types.
We also allow general functional relations among variables, thus giving rise to a Markov network structure learning algorithm in one of the most general settings.
- Score: 11.77890309304632
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A Markov network characterizes the conditional independence structure, or Markov property, among a set of random variables. Existing work focuses on specific families of distributions (e.g., exponential families) and/or certain structures of graphs, and most can only handle variables of a single data type (continuous or discrete). In this work, we characterize the conditional independence structure in general distributions for all data types (i.e., continuous, discrete, and mixed-type) with a Generalized Precision Matrix (GPM). In addition, we allow general functional relations among variables, thus giving rise to a Markov network structure learning algorithm in one of the most general settings. To deal with the computational challenge of the problem, especially for large graphs, we unify all cases under the same umbrella of a regularized score matching framework. We validate the theoretical results and demonstrate the scalability empirically in various settings.
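To make the regularized score matching framework concrete, below is a minimal sketch of its simplest special case: zero-mean Gaussian data, where minimizing the score matching loss with an L1 penalty on off-diagonal entries yields a sparse precision matrix whose zero pattern encodes conditional independence. This is an illustration under that Gaussian assumption only, with made-up function names; it is not the paper's GPM estimator for general or mixed-type distributions.

```python
# Hedged sketch: regularized score matching for a *Gaussian* Markov network,
# the simplest special case of the framework the abstract refers to.  The
# paper's Generalized Precision Matrix handles general nonparametric and
# mixed-type distributions; none of that is reproduced here.
import numpy as np

def score_matching_precision(X, lam=0.05, lr=0.05, n_iter=500):
    """Estimate a sparse precision matrix K by proximal gradient descent on
    the Gaussian score-matching loss J(K) = 0.5 * tr(K S K) - tr(K) plus an
    L1 penalty on off-diagonal entries, where S is the sample covariance of
    (assumed zero-mean) data X."""
    n, d = X.shape
    S = X.T @ X / n
    K = np.eye(d)
    off_diag = ~np.eye(d, dtype=bool)
    for _ in range(n_iter):
        grad = 0.5 * (S @ K + K @ S) - np.eye(d)          # gradient of J(K)
        K = K - lr * grad
        # soft-threshold off-diagonal entries (proximal step for the L1 term)
        K[off_diag] = np.sign(K[off_diag]) * np.maximum(
            np.abs(K[off_diag]) - lr * lam, 0.0)
        K = 0.5 * (K + K.T)                               # keep K symmetric
    return K

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    # ground truth: a chain graph, so the true precision matrix is tridiagonal
    K_true = np.eye(d) + 0.4 * (np.eye(d, k=1) + np.eye(d, k=-1))
    X = rng.multivariate_normal(np.zeros(d), np.linalg.inv(K_true), size=2000)
    K_hat = score_matching_precision(X)
    print(np.round(K_hat, 2))  # entries off the chain should be (near) zero
```

In the Gaussian case a zero off-diagonal entry of the precision matrix corresponds exactly to conditional independence between the two variables given the rest; the GPM is intended to extend this kind of correspondence beyond the Gaussian setting.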
Related papers
- Towards Stable, Globally Expressive Graph Representations with Laplacian Eigenvectors [29.055130767451036]
We propose a novel method exploiting Laplacian eigenvectors to generate stable and globally expressive graph representations.
Our method handles numerically close eigenvalues in a smooth fashion, ensuring better robustness against perturbations.
arXiv Detail & Related papers (2024-10-13T06:02:25Z)
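As a rough illustration of the entry above, the sketch below builds a plain Laplacian-eigenvector positional encoding. The sign and basis ambiguity of eigenvectors with numerically close eigenvalues is exactly the instability the paper targets; its specific remedy is not reproduced here, and the function name is hypothetical.

```python
# A generic Laplacian-eigenvector node representation (not the paper's method).
import numpy as np

def laplacian_positional_encoding(adj, k=4):
    """Use the k lowest non-trivial eigenvectors of the symmetric normalized
    Laplacian as node features.  Eigenvector signs (and bases within repeated
    eigenvalues) are arbitrary, which is the instability the paper addresses."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)   # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]               # drop the trivial eigenvector

# usage: a 5-node cycle graph
adj = np.zeros((5, 5))
for i in range(5):
    adj[i, (i + 1) % 5] = adj[(i + 1) % 5, i] = 1.0
print(laplacian_positional_encoding(adj, k=2).shape)   # (5, 2)
```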
- Signed Diverse Multiplex Networks: Clustering and Inference [4.070200285321219]
The SGRDPG setting is extended to a multiplex version, where all layers have the same collection of nodes and follow the SGRDPG.
The paper fulfills two objectives. First, it shows that keeping the signs of the edges during network construction leads to better precision of estimation and clustering.
Second, by employing novel algorithms, our paper ensures strongly consistent clustering of layers and high accuracy of subspace estimation.
arXiv Detail & Related papers (2024-02-14T19:37:30Z)
- iSCAN: Identifying Causal Mechanism Shifts among Nonlinear Additive Noise Models [48.33685559041322]
This paper focuses on identifying the causal mechanism shifts in two or more related datasets over the same set of variables.
Code implementing the proposed method is open-source and publicly available at https://github.com/kevinsbello/iSCAN.
arXiv Detail & Related papers (2023-06-30T01:48:11Z)
- Invariance Principle Meets Out-of-Distribution Generalization on Graphs [66.04137805277632]
The complex nature of graphs thwarts the adoption of the invariance principle for OOD generalization.
Domain or environment partitions, which are often required by OOD methods, can be expensive to obtain for graphs.
We propose a novel framework to explicitly model this process using a contrastive strategy.
arXiv Detail & Related papers (2022-02-11T04:38:39Z)
- Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
arXiv Detail & Related papers (2021-10-07T11:05:23Z)
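To illustrate the frame-averaging recipe from the entry above, here is a minimal sketch for the simplest possible symmetry, the sign-flip group {+1, -1}: averaging any backbone over the group makes the output invariant. The backbone is a placeholder, and the paper's frames for Euclidean motions and permutations are not reproduced.

```python
# The generic frame-averaging recipe, shown for the sign-flip group {+1, -1}.
import numpy as np

def backbone(x):
    # any non-invariant feature map; hypothetical placeholder
    return np.concatenate([x, x ** 2, np.tanh(x)])

def frame_average(x):
    """Average the backbone over the group {+1, -1} acting by scalar
    multiplication, which makes the output invariant to sign flips of x."""
    group = [1.0, -1.0]
    return np.mean([backbone(g * x) for g in group], axis=0)

x = np.array([0.3, -1.2, 2.0])
print(np.allclose(frame_average(x), frame_average(-x)))  # True: invariant
```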
- Diagonal Nonlinear Transformations Preserve Structure in Covariance and Precision Matrices [3.652509571098291]
For a certain class of non-Gaussian distributions, the correspondences still hold: exactly for the covariance and approximately for the precision.
The distributions -- sometimes referred to as "nonparanormal" -- are given by diagonal transformations of multivariate normal random variables.
arXiv Detail & Related papers (2021-07-08T22:31:48Z)
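A small simulation makes the entry above concrete: draw from a Gaussian with a chain-structured (tridiagonal) precision matrix, push each coordinate through an arbitrary monotone transformation, and inspect the empirical precision of the transformed data. This is only an illustrative sanity check under these simulated assumptions, not the paper's analysis.

```python
# Sanity-check sketch of the "nonparanormal" claim above (illustration only).
import numpy as np

rng = np.random.default_rng(0)
d = 4
# Gaussian with a chain-structured (tridiagonal) precision matrix
K = np.eye(d) + 0.4 * (np.eye(d, k=1) + np.eye(d, k=-1))
Z = rng.multivariate_normal(np.zeros(d), np.linalg.inv(K), size=200_000)

# diagonal (coordinate-wise) monotone transformations, chosen arbitrarily
X = np.column_stack([np.exp(Z[:, 0]),
                     Z[:, 1] + 0.3 * Z[:, 1] ** 3,
                     np.tanh(Z[:, 2]),
                     Z[:, 3]])

prec_X = np.linalg.inv(np.cov(X, rowvar=False))
print(np.round(prec_X / np.abs(prec_X).max(), 2))
# per the entry above, entries outside the tridiagonal band should remain
# comparatively small: the precision structure is approximately preserved
```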
- Statistical Analysis from the Fourier Integral Theorem [9.619814126465206]
We look at Monte Carlo based estimators of conditional distribution functions.
We study a number of problems, such as prediction for Markov processes.
Estimators are explicit, Monte Carlo based, and require no iterative algorithms.
arXiv Detail & Related papers (2021-06-11T20:44:54Z)
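As a toy version of the Monte Carlo construction mentioned above, the sketch below uses the sinc kernel suggested by the Fourier integral theorem, f(y) ≈ (1/(nπ)) Σ_i sin(R(y − Y_i))/(y − Y_i), to estimate a marginal density from i.i.d. samples. The paper's conditional-distribution and Markov-process estimators are not reproduced, and the function name is made up.

```python
# Sinc-kernel Monte Carlo density estimate motivated by the Fourier integral
# theorem (toy illustration only).
import numpy as np

def fourier_density_estimate(y, samples, R=10.0):
    """Estimate f(y) as (1/(n*pi)) * sum_i sin(R*(y - Y_i)) / (y - Y_i)."""
    diff = y - samples
    # sin(R*t)/t written via np.sinc, which also handles t = 0 correctly
    kernel = R * np.sinc(R * diff / np.pi)
    return kernel.mean() / np.pi

rng = np.random.default_rng(0)
samples = rng.normal(size=5000)            # i.i.d. draws from N(0, 1)
for y in (0.0, 1.0, 2.0):
    true = np.exp(-y ** 2 / 2) / np.sqrt(2 * np.pi)
    print(y, round(fourier_density_estimate(y, samples), 3), round(true, 3))
    # the estimated and true densities should roughly agree
```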
- Structure Learning of Contextual Markov Networks using Marginal Pseudo-likelihood [5.364120183147694]
We introduce the marginal pseudo-likelihood as an analytically tractable criterion for general contextual Markov networks.
Our criterion is shown to yield a consistent structure estimator.
arXiv Detail & Related papers (2021-03-29T12:13:15Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
arXiv Detail & Related papers (2020-06-26T17:15:16Z)
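The sketch below is a toy rendering of the first idea in the entry above: propagate a one-hot encoding of the nodes alongside the features so each node accumulates a local context. It is a bare, unparametrized aggregation step for illustration, not the paper's learned message and update functions; names are hypothetical.

```python
# Toy propagation of one-hot node identifiers alongside features.
import numpy as np

def structural_mp_step(adj, context, features):
    """One aggregation step: each node adds its neighbours' context rows and
    feature vectors to its own.  `context` starts as the identity matrix,
    i.e. a one-hot identifier per node."""
    new_context = context + adj @ context     # accumulate one-hot encodings
    new_features = features + adj @ features  # plain feature aggregation
    return new_context, new_features

# usage: a 4-node path graph
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
context = np.eye(4)                                  # one-hot identifiers
features = np.random.default_rng(0).normal(size=(4, 3))
context, features = structural_mp_step(adj, context, features)
print(context)  # row i now records which nodes lie within one hop of node i
```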
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
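To ground the phrase "dyadic independence (i.e., edge-independent) distributions" from the entry above, here is a minimal sampler for an edge-independent graph with block-structured edge probabilities (essentially a stochastic block model). It only illustrates the target family of the approximation, not the paper's fitting procedure; names are made up.

```python
# Minimal sampler for an edge-independent graph with block-structured
# probabilities (a stochastic block model), illustrating the dyadic
# independence family only.
import numpy as np

def sample_block_graph(block_sizes, block_probs, rng):
    """Sample an undirected graph where each edge (i, j), i < j, is included
    independently with probability block_probs[b_i, b_j]."""
    labels = np.repeat(np.arange(len(block_sizes)), block_sizes)
    probs = block_probs[labels][:, labels]            # n x n edge probabilities
    upper = np.triu(rng.random(probs.shape) < probs, k=1)
    return (upper | upper.T).astype(int), labels

rng = np.random.default_rng(0)
block_probs = np.array([[0.6, 0.05],
                        [0.05, 0.4]])
adj, labels = sample_block_graph([50, 50], block_probs, rng)
print(adj.sum() // 2, "edges sampled")
```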
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.