Marginal Inference queries in Hidden Markov Models under context-free
grammar constraints
- URL: http://arxiv.org/abs/2206.12862v1
- Date: Sun, 26 Jun 2022 12:44:18 GMT
- Title: Marginal Inference queries in Hidden Markov Models under context-free
grammar constraints
- Authors: Reda Marzouk, Colin de La Higuera
- Abstract summary: We address the question of computing the likelihood of context-free grammars (CFGs) in Hidden Markov Models (HMMs).
We show that the problem is NP-hard, even with the promise that the input CFG has a degree of ambiguity less than or equal to 2.
We then propose a fully polynomial randomized approximation scheme (FPRAS) to approximate the likelihood for CFGs with polynomially-bounded ambiguity.
- Score: 0.348097307252416
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The primary use of any probabilistic model involving a set of random
variables is to run inference and sampling queries on it. Inference queries in
classical probabilistic models are concerned with the computation of marginal or
conditional probabilities of events given as input. When the probabilistic
model is sequential, more sophisticated marginal inference queries involving
complex grammars may be of interest in fields such as computational linguistics
and NLP. In this work, we address the question of computing the likelihood of
context-free grammars (CFGs) in Hidden Markov Models (HMMs). We provide a
dynamic-programming algorithm for the exact computation of the likelihood for
the class of unambiguous context-free grammars. We show that the problem is NP-hard, even
with the promise that the input CFG has a degree of ambiguity less than or
equal to 2. We then propose a fully polynomial randomized approximation scheme
(FPRAS) to approximate the likelihood for the class of CFGs with
polynomially-bounded ambiguity.
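
A concrete way to see the exact computation for the unambiguous case is a CKY-style intersection of the grammar with the HMM. The sketch below is illustrative only: the two-state HMM, the grammar (S -> X S | a, X -> a, generating a^n), and the name cfg_likelihood are all assumptions for the example, not taken from the paper. For each span length l and nonterminal X, a matrix W_l(X)[q, q'] accumulates the probability that the HMM emits a length-l string derivable from X with hidden state q at its first symbol and q' at its last; unambiguity guarantees every string is counted exactly once.

```python
# Sketch: likelihood of a length-T string from an unambiguous CNF grammar
# under an HMM, via a CKY-style dynamic program. Toy model, not the paper's
# exact construction.
import numpy as np
from collections import defaultdict

# Hypothetical HMM: 2 hidden states, alphabet {a, b}.
pi = np.array([0.6, 0.4])              # initial state distribution
A = np.array([[0.7, 0.3],              # A[q, q'] = P(next state q' | state q)
              [0.2, 0.8]])
B = {"a": np.array([0.9, 0.1]),        # B[s][q] = P(emit s | state q)
     "b": np.array([0.1, 0.9])}

# Hypothetical unambiguous CNF grammar: S -> X S | a, X -> a (language a^n).
lexical = {"S": ["a"], "X": ["a"]}     # rules X -> terminal
binary = {"S": [("X", "S")]}           # rules X -> Y Z

def cfg_likelihood(T):
    """P(the HMM emits a length-T string in L(G)); unambiguity of G ensures
    each string contributes exactly one derivation to the sum."""
    n = len(pi)
    # W[l][X][q, q'] = prob. of emitting a length-l string derivable from X,
    # with hidden state q at its first symbol and q' at its last.
    W = [defaultdict(lambda: np.zeros((n, n))) for _ in range(T + 1)]
    for X, terminals in lexical.items():
        for s in terminals:
            W[1][X] += np.diag(B[s])   # one symbol: first state = last state
    for l in range(2, T + 1):
        for X, rhss in binary.items():
            for Y, Z in rhss:
                for k in range(1, l):  # split: Y spans k symbols, Z the rest
                    W[l][X] += W[k][Y] @ A @ W[l - k][Z]
    return float(pi @ W[T]["S"] @ np.ones(n))

print(cfg_likelihood(3))  # equals P(emit "aaa"), the only length-3 string here
```

Because the HMM is time-homogeneous, the chart can be indexed by span length rather than position, so this sketch runs in roughly O(|R| T^2 n^3) time for |R| binary rules, string length T, and n hidden states. For ambiguous grammars the same recursion over-counts strings with multiple parses, which is where the paper's NP-hardness result and FPRAS come in; a naive Monte Carlo alternative (sample strings from the HMM, test CFG membership) is unbiased but offers no relative-error guarantee when the likelihood is small.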
Related papers
- Robust Gaussian Processes via Relevance Pursuit [17.39376866275623]
We propose and study a GP model that achieves robustness against sparse outliers by inferring data-point-specific noise levels.
We show, surprisingly, that the model can be parameterized such that the associated log marginal likelihood is strongly concave in the data-point-specific noise variances.
arXiv Detail & Related papers (2024-10-31T17:59:56Z)
- Multivariate Systemic Risk Measures and Computation by Deep Learning Algorithms [63.03966552670014]
We discuss the key related theoretical aspects, with a particular focus on the fairness properties of primal optima and associated risk allocations.
The algorithms we provide allow for learning primals, optima for the dual representation and corresponding fair risk allocations.
arXiv Detail & Related papers (2023-02-02T22:16:49Z)
- Learning to Bound Counterfactual Inference in Structural Causal Models from Observational and Randomised Data [64.96984404868411]
We derive a likelihood characterisation for the overall data that leads us to extend a previous EM-based algorithm.
The new algorithm learns to approximate the (unidentifiability) region of model parameters from such mixed data sources.
It delivers interval approximations to counterfactual results, which collapse to points in the identifiable case.
arXiv Detail & Related papers (2022-12-06T12:42:11Z)
- Predictive Querying for Autoregressive Neural Sequence Models [23.85426261235507]
We introduce a general typology for predictive queries in neural autoregressive sequence models.
We show that such queries can be systematically represented by sets of elementary building blocks.
We leverage this typology to develop new query estimation methods.
arXiv Detail & Related papers (2022-10-12T17:59:36Z)
- Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z)
- pRSL: Interpretable Multi-label Stacking by Learning Probabilistic Rules [0.0]
We present the probabilistic rule stacking (pRSL) which uses probabilistic propositional logic rules and belief propagation to combine the predictions of several underlying classifiers.
We derive algorithms for exact and approximate inference and learning, and show that pRSL reaches state-of-the-art performance on various benchmark datasets.
arXiv Detail & Related papers (2021-05-28T14:06:21Z)
- Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) for their efficient representation.
PGCs not only provide a theoretical framework that unifies vastly different existing models, but also show great potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
arXiv Detail & Related papers (2021-02-19T07:06:53Z)
- Efficient semidefinite-programming-based inference for binary and multi-class MRFs [83.09715052229782]
We propose an efficient method for computing the partition function or MAP estimate in a pairwise MRF.
We extend semidefinite relaxations from the typical binary MRF to the full multi-class setting, and develop a compact semidefinite relaxation that can likewise be solved efficiently with the same solver.
arXiv Detail & Related papers (2020-12-04T15:36:29Z)
- Tractable Inference in Credal Sentential Decision Diagrams [116.6516175350871]
Probabilistic sentential decision diagrams are logic circuits where the inputs of disjunctive gates are annotated by probability values.
We develop the credal sentential decision diagrams, a generalisation of their probabilistic counterpart that allows for replacing the local probabilities with credal sets of mass functions.
For a first empirical validation, we consider a simple application based on noisy seven-segment display images.
arXiv Detail & Related papers (2020-08-19T16:04:34Z)
- Likelihood-Free Gaussian Process for Regression [0.0]
In some cases, we have little knowledge regarding the probability model.
We propose a novel framework called the likelihood-free Gaussian process (LFGP)
We expect that the proposed framework will contribute significantly to likelihood-free modeling.
arXiv Detail & Related papers (2020-06-02T19:12:53Z)
- Generating Random Logic Programs Using Constraint Programming [12.47276164048813]
We present a novel approach to generating random logic programs using constraint programming.
We show how the model scales with parameter values, and use the model to compare probabilistic inference algorithms across a range of synthetic problems.
arXiv Detail & Related papers (2020-06-02T19:12:53Z)