Tree-AMP: Compositional Inference with Tree Approximate Message Passing
- URL: http://arxiv.org/abs/2004.01571v3
- Date: Sat, 11 Dec 2021 14:02:55 GMT
- Title: Tree-AMP: Compositional Inference with Tree Approximate Message Passing
- Authors: Antoine Baker, Benjamin Aubin, Florent Krzakala, Lenka Zdeborová
- Abstract summary: Tree-AMP is a python package for compositional inference in high-dimensional tree-structured models.
The package provides a unifying framework to study several approximate message passing algorithms.
- Score: 23.509275850721778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Tree-AMP, standing for Tree Approximate Message Passing, a
python package for compositional inference in high-dimensional tree-structured
models. The package provides a unifying framework to study several approximate
message passing algorithms previously derived for a variety of machine learning
tasks such as generalized linear models, inference in multi-layer networks,
matrix factorization, and reconstruction using non-separable penalties. For
some models, the asymptotic performance of the algorithm can be theoretically
predicted by the state evolution, and the measurements entropy estimated by the
free entropy formalism. The implementation is modular by design: each module,
which implements a factor, can be composed at will with other modules to solve
complex inference tasks. The user only needs to declare the factor graph of the
model: the inference algorithm, state evolution and entropy estimation are
fully automated.
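As a concrete illustration of the kind of message passing that Tree-AMP automates once the factor graph is declared, below is a minimal, self-contained sketch of plain AMP for sparse linear regression (y = A x + noise) with a Gauss-Bernoulli prior. It deliberately does not use the Tree-AMP API: the function names (gauss_bernoulli_denoiser, amp_sparse_regression), the scaling of A, and the initialization are illustrative assumptions. The scalar denoiser plays the role that a prior module would play in the package's modular design, and the last line of the loop mirrors a state-evolution-style variance update.

```python
# Toy AMP sketch for y = A x + noise with a Gauss-Bernoulli prior on x.
# This illustrates the message passing that Tree-AMP automates from a
# declared factor graph; it does NOT use the Tree-AMP API, and all names
# below are illustrative, not part of the package.
import numpy as np

def gauss_bernoulli_denoiser(r, tau2, rho=0.1):
    """Posterior mean/variance of x ~ (1-rho)*delta_0 + rho*N(0, 1)
    given the Gaussian pseudo-observation r = x + N(0, tau2)."""
    s = tau2
    # Evidence of each mixture component for the observation r
    slab = rho * np.exp(-0.5 * r**2 / (1.0 + s)) / np.sqrt(2 * np.pi * (1.0 + s))
    spike = (1 - rho) * np.exp(-0.5 * r**2 / s) / np.sqrt(2 * np.pi * s)
    w = slab / (slab + spike)          # posterior weight of the slab
    m_slab = r / (1.0 + s)             # slab posterior mean
    v_slab = s / (1.0 + s)             # slab posterior variance
    mean = w * m_slab
    var = w * (v_slab + m_slab**2) - mean**2
    return mean, var

def amp_sparse_regression(A, y, noise_var, rho=0.1, n_iter=30):
    """Plain AMP for y = A x + N(0, noise_var), A with ~N(0, 1/m) entries."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    tau2 = noise_var + (n / m) * rho   # crude initial effective variance
    for _ in range(n_iter):
        r = x + A.T @ z                # pseudo-observation of x
        x, v = gauss_bernoulli_denoiser(r, tau2, rho)
        onsager = (n / m) * np.mean(v) / tau2
        z = y - A @ x + onsager * z    # residual with Onsager correction
        tau2 = noise_var + (n / m) * np.mean(v)  # state-evolution-style update
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, rho, noise_var = 2000, 1000, 0.1, 1e-3
    x0 = rng.binomial(1, rho, n) * rng.standard_normal(n)
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x0 + np.sqrt(noise_var) * rng.standard_normal(m)
    x_hat = amp_sparse_regression(A, y, noise_var, rho)
    print("MSE:", np.mean((x_hat - x0) ** 2))
```

In the compositional setting described in the abstract, the Gauss-Bernoulli denoiser above would correspond to one factor module (a separable prior) and the linear map to another (a linear channel), with the full algorithm assembled automatically from the declared factor graph rather than hand-coded as in this sketch.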
Related papers
- Approximate learning of parsimonious Bayesian context trees [0.0]
The proposed framework is tested on synthetic and real-world data examples.
It outperforms existing sequence models when fitted to real protein sequences and honeypot computer terminal sessions.
arXiv Detail & Related papers (2024-07-27T11:50:40Z) - Terminating Differentiable Tree Experts [77.2443883991608]
We propose a neuro-symbolic Differentiable Tree Machine that learns tree operations using a combination of transformers and Tensor Product Representations.
We first remove the need for a different transformer layer at every step by introducing a mixture of experts.
We additionally propose a new termination algorithm that lets the model decide automatically how many steps to take.
arXiv Detail & Related papers (2024-07-02T08:45:38Z) - Beyond TreeSHAP: Efficient Computation of Any-Order Shapley Interactions
for Tree Ensembles [6.664930499708017]
The Shapley value (SV) is a concept in explainable artificial intelligence (XAI) research for quantifying additive feature attributions of predictions.
We present TreeSHAP-IQ, an efficient method to compute any-order additive Shapley interactions for predictions of tree-based models.
arXiv Detail & Related papers (2024-01-22T16:08:41Z) - A Recursive Bateson-Inspired Model for the Generation of Semantic Formal
Concepts from Spatial Sensory Data [77.34726150561087]
This paper presents a new symbolic-only method for the generation of hierarchical concept structures from complex sensory data.
The approach is based on Bateson's notion of difference as the key to the genesis of an idea or a concept.
The model is able to produce fairly rich yet human-readable conceptual representations without training.
arXiv Detail & Related papers (2023-07-16T15:59:13Z) - Compositional Probabilistic and Causal Inference using Tractable Circuit
Models [20.07977560803858]
We introduce md-vtrees, a novel structural formulation of (marginal) determinism in structured decomposable PCs.
We derive the first polytime algorithms for causal inference queries such as backdoor adjustment on PCs.
arXiv Detail & Related papers (2023-04-17T13:48:16Z) - Unboxing Tree Ensembles for interpretability: a hierarchical
visualization tool and a multivariate optimal re-built tree [0.34530027457862006]
We develop an interpretable representation of a tree-ensemble model that can provide valuable insights into its behavior.
The proposed model is effective in yielding a shallow interpretable tree that approximates the tree-ensemble decision function.
arXiv Detail & Related papers (2023-02-15T10:43:31Z) - A General Framework for Sample-Efficient Function Approximation in
Reinforcement Learning [132.45959478064736]
We propose a general framework that unifies model-based and model-free reinforcement learning.
We propose a novel estimation function with decomposable structural properties for optimization-based exploration.
Under our framework, a new sample-efficient algorithm, OPtimization-based ExploRation with Approximation (OPERA), is proposed.
arXiv Detail & Related papers (2022-09-30T17:59:16Z) - Summarization Programs: Interpretable Abstractive Summarization with
Neural Modular Trees [89.60269205320431]
Current abstractive summarization models either suffer from a lack of clear interpretability or provide incomplete rationales.
We propose the Summarization Program (SP), an interpretable modular framework consisting of an (ordered) list of binary trees.
A Summarization Program contains one root node per summary sentence, and a distinct tree connects each summary sentence to the document sentences.
arXiv Detail & Related papers (2022-09-21T16:50:22Z) - PAC Reinforcement Learning for Predictive State Representations [60.00237613646686]
We study online Reinforcement Learning (RL) in partially observable dynamical systems.
We focus on the Predictive State Representations (PSRs) model, which is an expressive model that captures other well-known models.
We develop a novel model-based algorithm for PSRs that can learn a near-optimal policy with sample complexity scaling polynomially in the relevant parameters.
arXiv Detail & Related papers (2022-07-12T17:57:17Z) - TreeFlow: Going beyond Tree-based Gaussian Probabilistic Regression [0.0]
We introduce TreeFlow, a tree-based approach that combines the benefits of tree ensembles with the ability to model flexible probability distributions.
We evaluate the proposed method on challenging regression benchmarks with varying volume, feature characteristics, and target dimensionality.
arXiv Detail & Related papers (2022-06-08T20:06:23Z) - Polynomial-Time Exact MAP Inference on Discrete Models with Global
Dependencies [83.05591911173332]
The junction tree algorithm is the most general solution for exact MAP inference with run-time guarantees.
We propose a new graph transformation technique via node cloning that ensures a run-time for solving our target problem independent of the form of the corresponding clique tree.
arXiv Detail & Related papers (2019-12-27T13:30:29Z)