Computing Exact Shapley Values in Polynomial Time for Product-Kernel Methods
- URL: http://arxiv.org/abs/2505.16516v1
- Date: Thu, 22 May 2025 10:53:04 GMT
- Title: Computing Exact Shapley Values in Polynomial Time for Product-Kernel Methods
- Authors: Majid Mohammadi, Siu Lun Chau, Krikamol Muandet
- Abstract summary: PKeX-Shapley is a novel algorithm that enables the exact computation of Shapley values in polynomial time. We show that PKeX-Shapley yields computational efficiency and enhances interpretability in kernel-based learning.
- Score: 11.743255602108775
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kernel methods are widely used in machine learning due to their flexibility and expressive power. However, their black-box nature poses significant challenges to interpretability, limiting their adoption in high-stakes applications. Shapley value-based feature attribution techniques, such as SHAP and kernel-specific variants like RKHS-SHAP, offer a promising path toward explainability. Yet, computing exact Shapley values remains computationally intractable in general, motivating the development of various approximation schemes. In this work, we introduce PKeX-Shapley, a novel algorithm that utilizes the multiplicative structure of product kernels to enable the exact computation of Shapley values in polynomial time. We show that product-kernel models admit a functional decomposition that allows for a recursive formulation of Shapley values. This decomposition not only yields computational efficiency but also enhances interpretability in kernel-based learning. We also demonstrate how our framework can be generalized to explain kernel-based statistical discrepancies such as the Maximum Mean Discrepancy (MMD) and the Hilbert-Schmidt Independence Criterion (HSIC), thus offering new tools for interpretable statistical inference.
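For orientation, the standard definitions the abstract builds on are reproduced below in textbook notation. This is background only, not the paper's recursive formulation; the symbols (value function v, feature set N with |N| = n, per-coordinate kernels k_j) follow common usage rather than the paper's exact notation. A naive brute-force computation of these quantities for a toy product-kernel model is sketched after the related-papers list below.

```latex
% Shapley value of feature i under a value function v defined on subsets of N, |N| = n:
\phi_i(v) \;=\; \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(n-|S|-1)!}{n!}\,\bigl(v(S \cup \{i\}) - v(S)\bigr)

% Product kernel over d features, one factor per coordinate:
k(x, x') \;=\; \prod_{j=1}^{d} k_j(x_j, x'_j)
```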
Related papers
- Scaling Probabilistic Circuits via Monarch Matrices [109.65822339230853]
Probabilistic Circuits (PCs) are tractable representations of probability distributions. We propose a novel sparse and structured parameterization for the sum blocks in PCs.
arXiv Detail & Related papers (2025-06-14T07:39:15Z)
- Toward Efficient Kernel-Based Solvers for Nonlinear PDEs [19.975293084297014]
This paper introduces a novel kernel learning framework toward efficiently solving nonlinear partial differential equations (PDEs).
In contrast to the state-of-the-art kernel solver that embeds differential operators within kernels, our approach eliminates these operators from the kernel.
We model the solution using a standard kernel form and differentiate the interpolant to compute the derivatives.
arXiv Detail & Related papers (2024-10-15T01:00:43Z)
- Improving the Weighting Strategy in KernelSHAP [0.8057006406834466]
In Explainable AI (XAI), Shapley values are a popular framework for explaining predictions made by complex machine learning models. We propose a novel modification of KernelSHAP which replaces the stochastic weights with deterministic ones to reduce the variance of the resulting Shapley value approximations. Our methods can reduce the required number of contribution function evaluations by 5% to 50% while preserving the accuracy of the approximated Shapley values.
arXiv Detail & Related papers (2024-10-07T10:02:31Z)
- Fast Shapley Value Estimation: A Unified Approach [71.92014859992263]
We propose a straightforward and efficient Shapley estimator, SimSHAP, by eliminating redundant techniques.
In our analysis of existing approaches, we observe that estimators can be unified as a linear transformation of randomly summed values from feature subsets.
Our experiments validate the effectiveness of our SimSHAP, which significantly accelerates the computation of accurate Shapley values.
arXiv Detail & Related papers (2023-11-02T06:09:24Z)
- Randomized Polar Codes for Anytime Distributed Machine Learning [66.46612460837147]
We present a novel distributed computing framework that is robust to slow compute nodes, and is capable of both approximate and exact computation of linear operations.
We propose a sequential decoding algorithm designed to handle real valued data while maintaining low computational complexity for recovery.
We demonstrate the potential applications of this framework in various contexts, such as large-scale matrix multiplication and black-box optimization.
arXiv Detail & Related papers (2023-09-01T18:02:04Z)
- Higher-order topological kernels via quantum computation [68.8204255655161]
Topological data analysis (TDA) has emerged as a powerful tool for extracting meaningful insights from complex data.
We propose a quantum approach to defining Betti kernels, which is based on constructing Betti curves with increasing order.
arXiv Detail & Related papers (2023-07-14T14:48:52Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Improved Random Features for Dot Product Kernels [12.321353062415701]
We make several novel contributions for improving the efficiency of random feature approximations for dot product kernels.
We show empirically that the use of complex features can significantly reduce the variances of these approximations.
We develop a data-driven optimization approach to improve random feature approximations for general dot product kernels.
arXiv Detail & Related papers (2022-01-21T14:16:56Z)
- RKHS-SHAP: Shapley Values for Kernel Methods [17.52161019964009]
We propose an attribution method for kernel machines that can efficiently compute both Interventional and Observational Shapley values.
We show theoretically that our method is robust with respect to local perturbations - a key yet often overlooked desideratum for interpretability.
arXiv Detail & Related papers (2021-10-18T10:35:36Z)
- Fast Hierarchical Games for Image Explanations [78.16853337149871]
We present a model-agnostic explanation method for image classification based on a hierarchical extension of Shapley coefficients.
Unlike other Shapley-based explanation methods, h-Shap is scalable and can be computed without the need for approximation.
We compare our hierarchical approach with popular Shapley-based and non-Shapley-based methods on a synthetic dataset, a medical imaging scenario, and a general computer vision problem.
arXiv Detail & Related papers (2021-04-13T13:11:02Z)
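As promised above, here is a minimal brute-force sketch of exact Shapley values for a toy product-kernel model, included only to make the exact-versus-approximate distinction in the entries above concrete. It uses a kernel ridge regressor with a per-dimension RBF product kernel and the interventional (marginal-expectation) value function; the function and variable names are illustrative, and the subset enumeration is O(2^d), so this is the exponential baseline that approximation schemes such as KernelSHAP and the polynomial-time PKeX-Shapley recursion aim to avoid, not the paper's algorithm.

```python
import itertools
from math import comb

import numpy as np


def rbf_product_kernel(X, Z, lengthscale=1.0):
    """Product of per-dimension RBF factors: k(x, z) = prod_j exp(-(x_j - z_j)^2 / (2 * l^2))."""
    sq_diff = (X[:, None, :] - Z[None, :, :]) ** 2
    return np.exp(-sq_diff / (2.0 * lengthscale ** 2)).prod(axis=-1)


def fit_kernel_ridge(X, y, lam=1e-2):
    """Dual coefficients of kernel ridge regression: alpha = (K + lam * I)^{-1} y."""
    K = rbf_product_kernel(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)


def interventional_value(x, coalition, X_train, X_bg, alpha):
    """v(S): expected model output with features in S fixed to x, the rest drawn from background data."""
    X_mix = np.repeat(x[None, :], len(X_bg), axis=0)
    outside = np.ones(x.shape[0], dtype=bool)
    outside[list(coalition)] = False
    X_mix[:, outside] = X_bg[:, outside]  # out-of-coalition features come from the background sample
    return float((rbf_product_kernel(X_mix, X_train) @ alpha).mean())


def brute_force_shapley(x, X_train, X_bg, alpha):
    """Exact Shapley values by enumerating all feature subsets (O(2^d); small d only)."""
    d = x.shape[0]
    phi = np.zeros(d)
    for i in range(d):
        others = [j for j in range(d) if j != i]
        for size in range(d):
            weight = 1.0 / (d * comb(d - 1, size))  # |S|! (d - |S| - 1)! / d! for |S| = size
            for S in itertools.combinations(others, size):
                gain = (interventional_value(x, set(S) | {i}, X_train, X_bg, alpha)
                        - interventional_value(x, S, X_train, X_bg, alpha))
                phi[i] += weight * gain
    return phi


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 3))
    y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]
    alpha = fit_kernel_ridge(X, y)
    print(brute_force_shapley(X[0], X_train=X, X_bg=X, alpha=alpha))
```

By the efficiency property, the returned attributions should sum to the model's prediction at the explained point minus its average prediction over the background sample, which serves as a quick sanity check for any exact Shapley computation.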