Lifted Algorithms for Symmetric Weighted First-Order Model Sampling
- URL: http://arxiv.org/abs/2308.08828v3
- Date: Fri, 14 Jun 2024 07:39:07 GMT
- Title: Lifted Algorithms for Symmetric Weighted First-Order Model Sampling
- Authors: Yuanhong Wang, Juhua Pu, Yuyi Wang, Ondřej Kuželka
- Abstract summary: We prove the domain-liftability under sampling for the two-variable fragment of first-order logic with counting quantifiers.
We then show that this result continues to hold even in the presence of cardinality constraints.
Our algorithm outperforms a state-of-the-art WMS sampler by a substantial margin.
- Score: 7.007419073974192
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Weighted model counting (WMC) is the task of computing the weighted sum of all satisfying assignments (i.e., models) of a propositional formula. Similarly, weighted model sampling (WMS) aims to randomly generate models with probability proportional to their respective weights. Both WMC and WMS are hard to solve exactly, falling under the $\#\mathsf{P}$-hard complexity class. However, the counting problem can sometimes be tractable, if the propositional formula can be compactly represented and expressed in first-order logic. In such cases, model counting problems can be solved in time polynomial in the domain size and are known as domain-liftable. The following question then arises: is the same true for weighted model sampling? This paper addresses this question and answers it affirmatively. Specifically, we prove the domain-liftability under sampling for the two-variable fragment of first-order logic with counting quantifiers, by devising an efficient sampling algorithm for this fragment that runs in time polynomial in the domain size. We then further show that this result continues to hold even in the presence of cardinality constraints. To empirically verify our approach, we conduct experiments over various first-order formulas designed for the uniform generation of combinatorial structures and sampling in statistical-relational models. The results demonstrate that our algorithm outperforms a state-of-the-art WMS sampler by a substantial margin, confirming the theoretical results.
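The WMC and WMS definitions above can be illustrated by a deliberately naive brute-force sketch (the formula and weights below are hypothetical; the paper's contribution is precisely to avoid this exponential enumeration by exploiting first-order structure):

```python
import itertools
import random

# Toy formula: (a OR b) AND (NOT b OR c).
def formula(a, b, c):
    return (a or b) and ((not b) or c)

# Hypothetical literal weights: (weight if var is True, weight if var is False).
weights = {"a": (0.3, 0.7), "b": (0.6, 0.4), "c": (0.5, 0.5)}

def model_weight(assignment):
    w = 1.0
    for var, val in assignment.items():
        pos, neg = weights[var]
        w *= pos if val else neg
    return w

# WMC: sum the weights of all satisfying assignments (models).
models = []
for bits in itertools.product([True, False], repeat=3):
    assignment = dict(zip("abc", bits))
    if formula(**assignment):
        models.append(assignment)
wmc = sum(model_weight(m) for m in models)

# WMS: draw a model with probability proportional to its weight.
def sample_model(rng=random):
    r = rng.uniform(0, wmc)
    acc = 0.0
    for m in models:
        acc += model_weight(m)
        if r <= acc:
            return m
    return models[-1]

print(f"WMC = {wmc:.4f}")
print("sampled model:", sample_model())
```

Both loops visit all $2^n$ assignments, which is exactly the exponential cost that domain-lifted algorithms sidestep when the formula has a compact first-order representation.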
Related papers
- Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Sampling Approximately Low-Rank Ising Models: MCMC meets Variational Methods [35.24886589614034]
We consider quadratic definite Ising models on the hypercube with a general interaction $J$.
Our general result implies, for the first time, sampling algorithms for low-rank Ising models.
arXiv Detail & Related papers (2022-02-17T21:43:50Z) - Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z) - Direct sampling of projected entangled-pair states [0.0]
Variational Monte Carlo studies employing projected entangled-pair states (PEPS) have recently shown that they can provide answers on long-standing questions.
We propose a sampling algorithm that generates independent samples from a PEPS, bypassing all problems related to finite autocorrelation times.
arXiv Detail & Related papers (2021-09-15T15:09:20Z) - Finding Geometric Models by Clustering in the Consensus Space [61.65661010039768]
We propose a new algorithm for finding an unknown number of geometric models, e.g., homographies.
We present a number of applications where the use of multiple geometric models improves accuracy.
These include pose estimation from multiple generalized homographies; trajectory estimation of fast-moving objects.
arXiv Detail & Related papers (2021-03-25T14:35:07Z) - Breaking the Sample Size Barrier in Model-Based Reinforcement Learning with a Generative Model [50.38446482252857]
This paper is concerned with the sample efficiency of reinforcement learning, assuming access to a generative model (or simulator).
We first consider $\gamma$-discounted infinite-horizon Markov decision processes (MDPs) with state space $\mathcal{S}$ and action space $\mathcal{A}$.
We prove that a plain model-based planning algorithm suffices to achieve minimax-optimal sample complexity given any target accuracy level.
arXiv Detail & Related papers (2020-05-26T17:53:18Z) - Active Model Estimation in Markov Decision Processes [108.46146218973189]
We study the problem of efficient exploration in order to learn an accurate model of an environment, modeled as a Markov decision process (MDP).
We show that our Markov-based algorithm outperforms both our original algorithm and the maximum entropy algorithm in the small sample regime.
arXiv Detail & Related papers (2020-03-06T16:17:24Z) - Learning Gaussian Graphical Models via Multiplicative Weights [54.252053139374205]
We adapt an algorithm of Klivans and Meka based on the method of multiplicative weight updates.
The algorithm enjoys a sample complexity bound that is qualitatively similar to others in the literature.
It has a low runtime $O(mp^2)$ in the case of $m$ samples and $p$ nodes, and can trivially be implemented in an online manner.
arXiv Detail & Related papers (2020-02-20T10:50:58Z) - On the Approximability of Weighted Model Integration on DNF Structures [13.986963122264632]
We show that weighted model integration on DNF structures can indeed be approximated for a class of weight functions.
Our approximation algorithm is based on three subroutines, each of which can be either a weak (i.e., approximate) or a strong (i.e., exact) oracle.
arXiv Detail & Related papers (2020-02-17T00:29:41Z) - Approximate Weighted First-Order Model Counting: Exploiting Fast Approximate Model Counters and Symmetry [16.574508244279098]
We present ApproxWFOMC, a novel anytime method for efficiently bounding the weighted first-order model count.
The algorithm has applications to inference in a variety of first-order probabilistic representations.
We show how our algorithm outperforms existing approximate and exact techniques for inference in first-order probabilistic models.
arXiv Detail & Related papers (2020-01-15T12:21:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.