Axiomatic characterization of pointwise Shapley decompositions
- URL: http://arxiv.org/abs/2303.07773v1
- Date: Tue, 14 Mar 2023 10:24:48 GMT
- Title: Axiomatic characterization of pointwise Shapley decompositions
- Authors: Marcus C. Christiansen
- Abstract summary: A common problem in various applications is the additive decomposition of the output of a function with respect to its input variables.
In this paper, axioms are developed which fully preserve functional structures and lead to unique decompositions for all Borel measurable functions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A common problem in various applications is the additive decomposition of the
output of a function with respect to its input variables. Functions with binary
arguments can be axiomatically decomposed by the famous Shapley value. For the
decomposition of functions with real arguments, a popular method is the
pointwise application of the Shapley value on the domain. However, this
pointwise application largely ignores the overall structure of functions. In
this paper, axioms are developed which fully preserve functional structures and
lead to unique decompositions for all Borel measurable functions.
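The "pointwise application of the Shapley value" that the abstract critiques can be illustrated concretely. Below is a minimal sketch (not the paper's method) of the classical Shapley value computed by exhaustive subset enumeration: each real argument is switched between its actual value and a hypothetical baseline point, and the output difference is attributed to the coordinates. The function `f`, the inputs, and the baseline are illustrative assumptions.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley attribution of f(x) - f(baseline) by subset
    enumeration. O(2^n) in the number of arguments; illustrative only."""
    n = len(x)
    phi = [0.0] * n
    players = list(range(n))
    for i in players:
        others = [j for j in players if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Shapley weight |S|! (n - |S| - 1)! / n!
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                with_i = [x[j] if (j in S or j == i) else baseline[j]
                          for j in players]
                without_i = [x[j] if j in S else baseline[j]
                             for j in players]
                phi[i] += w * (f(with_i) - f(without_i))
    return phi

# Toy example: attributions sum to f(x) - f(baseline) (efficiency axiom).
f = lambda v: v[0] * v[1] + v[2]
x, base = [2.0, 3.0, 1.0], [0.0, 0.0, 0.0]
phi = shapley_values(f, x, base)  # -> [3.0, 3.0, 1.0]
```

Note that the result depends on the chosen baseline point, which is one way the pointwise approach ignores the overall structure of the function on its domain.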
Related papers
- Function Trees: Transparent Machine Learning [1.3597551064547502]
Knowing the global properties of such functions can help in understanding the system that produced the data.
A function tree is constructed that can be used to rapidly identify and compute all of the function's main and interaction effects.
arXiv Detail & Related papers (2024-03-19T20:23:31Z)
- Piecewise Polynomial Regression of Tame Functions via Integer Programming [2.2499166814992435]
We consider tame functions, a class of nonsmooth functions that includes all common activation functions, value functions of mixed-integer programs, and wave functions of small molecules.
arXiv Detail & Related papers (2023-11-22T17:37:42Z)
- Hoeffding decomposition of black-box models with dependent inputs [30.076357972854723]
We generalize Hoeffding's decomposition for dependent inputs under mild conditions.
We show that any square-integrable, real-valued function of random elements respecting two assumptions can be uniquely additively decomposed, and we offer a characterization of this decomposition.
arXiv Detail & Related papers (2023-10-10T12:28:53Z)
- Refining and relating fundamentals of functional theory [0.0]
For systems with time-reversal symmetry, we explain why there exist six equivalent universal functionals, prove concise relations among them and conclude that the important notion of $v$-representability is relative to the scope and choice of variable.
arXiv Detail & Related papers (2023-01-24T18:09:47Z)
- Reinforcement Learning from Partial Observation: Linear Function Approximation with Provable Sample Efficiency [111.83670279016599]
We study reinforcement learning for partially observable Markov decision processes (POMDPs) with infinite observation and state spaces.
We make the first attempt at partial observability and function approximation for a class of POMDPs with a linear structure.
arXiv Detail & Related papers (2022-04-20T21:15:38Z)
- Learning Aggregation Functions [78.47770735205134]
We introduce LAF (Learning Aggregation Functions), a learnable aggregator for sets of arbitrary cardinality.
We report experiments on semi-synthetic and real data showing that LAF outperforms state-of-the-art sum- (max-) decomposition architectures.
arXiv Detail & Related papers (2020-12-15T18:28:53Z)
- Finite-Function-Encoding Quantum States [52.77024349608834]
We introduce finite-function-encoding (FFE) states which encode arbitrary $d$-valued logic functions.
We investigate some of their structural properties.
arXiv Detail & Related papers (2020-12-01T13:53:23Z)
- A Functional Perspective on Learning Symmetric Functions with Neural Networks [48.80300074254758]
We study the learning and representation of neural networks defined on measures.
We establish approximation and generalization bounds under different choices of regularization.
The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes.
arXiv Detail & Related papers (2020-08-16T16:34:33Z)
- From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models [82.95892656532696]
Submodular functions have been studied extensively in machine learning and data mining.
In this work, we propose a continuous DR-submodular extension for integer submodular functions.
We formulate a new probabilistic model which is defined through integer submodular functions.
arXiv Detail & Related papers (2020-06-01T22:20:45Z)
- Invariant Feature Coding using Tensor Product Representation [75.62232699377877]
We prove that the group-invariant feature vector contains sufficient discriminative information when learning a linear classifier.
A novel feature model that explicitly considers group actions is proposed for principal component analysis and k-means clustering.
arXiv Detail & Related papers (2019-06-05T07:15:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.