Partition Function Estimation: A Quantitative Study
- URL: http://arxiv.org/abs/2105.11132v1
- Date: Mon, 24 May 2021 07:25:43 GMT
- Title: Partition Function Estimation: A Quantitative Study
- Authors: Durgesh Agrawal and Yash Pote and Kuldeep S Meel
- Abstract summary: A graphical model's partition function is a central quantity of interest.
Several techniques have been proposed over the years with varying guarantees on the quality of estimates.
Our empirical study yields a surprising observation: exact techniques are as efficient as the approximate ones.
- Score: 25.782420501870295
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic graphical models have emerged as a powerful modeling tool for
several real-world scenarios where one needs to reason under uncertainty. A
graphical model's partition function is a central quantity of interest, and its
computation is key to several probabilistic reasoning tasks. Given the
#P-hardness of computing the partition function, several techniques have been
proposed over the years with varying guarantees on the quality of estimates and
their runtime behavior. This paper seeks to present a survey of 18 techniques
and a rigorous empirical study of their behavior across an extensive set of
benchmarks. Our empirical study yields a surprising observation: exact
techniques are as efficient as the approximate ones, and therefore, we conclude
with an optimistic view of opportunities for the design of approximate
techniques with enhanced scalability. Motivated by the observation of an order
of magnitude difference between the Virtual Best Solver and the best performing
tool, we envision an exciting line of research focused on the development of
portfolio solvers.
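To make the central quantity concrete: the partition function Z of a discrete graphical model is the sum, over all joint assignments, of the product of the model's factors. The sketch below computes Z by brute-force enumeration for a toy three-variable binary chain; the model, factor values, and function names are illustrative and not taken from the paper, and this exhaustive approach is exactly the exponential-time baseline whose #P-hardness motivates the surveyed techniques.

```python
import itertools

# Brute-force partition function of a tiny pairwise Markov random field.
# The toy model and all names below are illustrative, not from the paper.

def partition_function(n_vars, unary, pairwise):
    """Sum the unnormalized weight over all 2^n binary assignments."""
    Z = 0.0
    for assignment in itertools.product([0, 1], repeat=n_vars):
        weight = 1.0
        for i, phi in unary.items():
            weight *= phi[assignment[i]]          # node potentials
        for (i, j), psi in pairwise.items():
            weight *= psi[assignment[i]][assignment[j]]  # edge potentials
        Z += weight
    return Z

# A 3-variable chain X0 - X1 - X2 with agreement-favoring edge potentials.
unary = {0: [1.0, 2.0], 1: [1.0, 1.0], 2: [3.0, 1.0]}
pairwise = {(0, 1): [[2.0, 1.0], [1.0, 2.0]],
            (1, 2): [[2.0, 1.0], [1.0, 2.0]]}
Z = partition_function(3, unary, pairwise)  # sums 8 assignments
```

Enumeration scales as 2^n, which is why practical exact tools rely on structure (e.g. treewidth) and why approximate counters with guarantees remain an active research target.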
Related papers
- Parameter-Efficient Active Learning for Foundational models [7.799711162530711]
Foundational vision transformer models have shown impressive few-shot performance on many vision tasks.
This research presents a novel investigation into the application of parameter efficient fine-tuning methods within an active learning (AL) framework.
arXiv Detail & Related papers (2024-06-13T16:30:32Z)
- Gradient Estimation and Variance Reduction in Stochastic and Deterministic Models [0.0]
This dissertation considers unconstrained, nonlinear optimization problems.
We focus on the gradient itself, that key quantity which enables the solution of such problems.
We present a new framework for calculating the gradient of problems involving both deterministic and stochastic elements.
arXiv Detail & Related papers (2024-05-14T14:41:58Z) - Efficient Model-Free Exploration in Low-Rank MDPs [76.87340323826945]
Low-Rank Markov Decision Processes offer a simple, yet expressive framework for RL with function approximation.
Existing algorithms are either (1) computationally intractable, or (2) reliant upon restrictive statistical assumptions.
We propose the first provably sample-efficient algorithm for exploration in Low-Rank MDPs.
arXiv Detail & Related papers (2023-07-08T15:41:48Z) - Latent Variable Representation for Reinforcement Learning [131.03944557979725]
It remains unclear theoretically and empirically how latent variable models may facilitate learning, planning, and exploration to improve the sample efficiency of model-based reinforcement learning.
We provide a representation view of the latent variable models for state-action value functions, which allows both a tractable variational learning algorithm and an effective implementation of the optimism/pessimism principle.
In particular, we propose a computationally efficient planning algorithm with UCB exploration by incorporating kernel embeddings of latent variable models.
arXiv Detail & Related papers (2022-12-17T00:26:31Z) - Offline Reinforcement Learning with Differentiable Function
Approximation is Provably Efficient [65.08966446962845]
Offline reinforcement learning, which aims at optimizing decision-making strategies with historical data, has been extensively applied in real-life applications.
We take a step by considering offline reinforcement learning with differentiable function class approximation (DFA).
Most importantly, we show offline differentiable function approximation is provably efficient by analyzing the pessimistic fitted Q-learning algorithm.
arXiv Detail & Related papers (2022-10-03T07:59:42Z) - Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z) - What Image Features Boost Housing Market Predictions? [81.32205133298254]
We propose a set of techniques for the extraction of visual features for efficient numerical inclusion in predictive algorithms.
We discuss techniques such as Shannon's entropy, calculating the center of gravity, employing image segmentation, and using Convolutional Neural Networks.
The set of 40 image features selected here carries a significant amount of predictive power and outperforms some of the strongest metadata predictors.
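Of the techniques this entry names, Shannon's entropy is the simplest to illustrate: the entropy of an image's intensity histogram summarizes how varied its pixel values are, yielding a single scalar feature. The sketch below is a minimal illustration of that idea, assuming grayscale integer pixels; the toy "images", bin count, and function name are illustrative and not taken from the paper.

```python
import math

def histogram_entropy(pixels, n_bins=256):
    """Shannon entropy (in bits) of the empirical intensity distribution."""
    counts = [0] * n_bins
    for p in pixels:
        counts[p] += 1
    total = len(pixels)
    entropy = 0.0
    for c in counts:
        if c:  # skip empty bins; 0 * log(0) is taken as 0
            prob = c / total
            entropy -= prob * math.log2(prob)
    return entropy

# A flat image carries zero entropy; a more varied one carries more.
flat = [128] * 16                 # every pixel identical
varied = [0, 64, 128, 192] * 4    # four equally frequent intensities
```

A uniform distribution over four intensities gives exactly 2 bits, so the feature cleanly separates textured regions from flat ones.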
arXiv Detail & Related papers (2021-07-15T06:32:10Z) - How to Design Sample and Computationally Efficient VQA Models [53.65668097847456]
We find that representing the text as probabilistic programs and images as object-level scene graphs best satisfy these desiderata.
We extend existing models to leverage these soft programs and scene graphs to train on question answer pairs in an end-to-end manner.
arXiv Detail & Related papers (2021-03-22T01:48:16Z) - Scalable Approximate Inference and Some Applications [2.6541211006790983]
In this thesis, we propose a new framework for approximate inference.
Our proposed four algorithms are motivated by the recent computational progress of Stein's method.
Results on simulated and real datasets indicate the statistical efficiency and wide applicability of our algorithms.
arXiv Detail & Related papers (2020-03-07T04:33:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.