Shapley-Inspired Feature Weighting in $k$-means with No Additional Hyperparameters
- URL: http://arxiv.org/abs/2508.07952v1
- Date: Mon, 11 Aug 2025 13:07:21 GMT
- Title: Shapley-Inspired Feature Weighting in $k$-means with No Additional Hyperparameters
- Authors: Richard J. Fawley, Renato Cordeiro de Amorim
- Abstract summary: Clustering algorithms often assume all features contribute equally to the data structure. We propose SHARK (Shapley Reweighted $k$-means), a feature-weighted clustering algorithm motivated by the use of Shapley values. Experiments on synthetic and real-world data sets show that SHARK consistently matches or outperforms existing methods.
- Score: 2.3940819037450987
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Clustering algorithms often assume all features contribute equally to the data structure, an assumption that usually fails in high-dimensional or noisy settings. Feature weighting methods can address this, but most require additional parameter tuning. We propose SHARK (Shapley Reweighted $k$-means), a feature-weighted clustering algorithm motivated by the use of Shapley values from cooperative game theory to quantify feature relevance, which requires no additional parameters beyond those in $k$-means. We prove that the $k$-means objective can be decomposed into a sum of per-feature Shapley values, providing an axiomatic foundation for unsupervised feature relevance and reducing Shapley computation from exponential to polynomial time. SHARK iteratively re-weights features by the inverse of their Shapley contribution, emphasising informative dimensions and down-weighting irrelevant ones. Experiments on synthetic and real-world data sets show that SHARK consistently matches or outperforms existing methods, achieving superior robustness and accuracy, particularly in noisy settings. Software: https://github.com/rickfawley/shark.
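The iterative loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation (see the linked repository for that): the per-feature within-cluster dispersion `phi` stands in for the per-feature Shapley decomposition of the $k$-means objective, and all function and variable names here are assumptions.

```python
import numpy as np

def shark_sketch(X, k, n_iter=10, seed=None):
    """Illustrative sketch of an inverse-Shapley feature-reweighting loop.

    phi[j] is feature j's contribution to the k-means objective; the
    abstract states this coincides with a per-feature Shapley value.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(d, 1.0 / d)                    # start from uniform feature weights
    centers = X[rng.choice(n, size=k, replace=False)].copy()
    for _ in range(n_iter):
        # assignment step: weighted squared Euclidean distance to each center
        dists = (((X[:, None, :] - centers[None, :, :]) ** 2) * w).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # update step: recompute centers (skip clusters that lost all points)
        for c in range(k):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(axis=0)
        # per-feature contribution to the (unweighted) k-means objective
        phi = ((X - centers[labels]) ** 2).sum(axis=0)
        # reweight by inverse contribution; epsilon avoids division by zero
        inv = 1.0 / (phi + 1e-12)
        w = inv / inv.sum()
    return labels, w
```

Features with low within-cluster dispersion (i.e., informative dimensions) receive larger weights, which then sharpen the next assignment step, matching the abstract's description of iterative re-weighting.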
Related papers
- Exactly Computing do-Shapley Values [16.29328788692574]
We show how to compute do-Shapley values in time linear in the number of irreducible sets $r$. We also prove that non-parametric identifiability of do-Shapley values requires only the identification of interventional effects for the $d$ singleton coalitions.
arXiv Detail & Related papers (2026-02-06T21:24:56Z) - PFF-Net: Patch Feature Fitting for Point Cloud Normal Estimation [81.94096000733127]
We present a new idea of feature extraction for robust normal estimation of point clouds. We use the fusion of multi-scale features from different neighborhood sizes to address the issue of selecting reasonable patch sizes for various data or geometries. Our approximation strategy based on aggregating the features of multiple scales enables the model to achieve scale adaptation of varying local patches.
arXiv Detail & Related papers (2025-11-26T13:12:14Z) - shapr: Explaining Machine Learning Models with Conditional Shapley Values in R and Python [0.6562256987706128]
shapr is a versatile tool for generating Shapley value based prediction explanations. The shaprpy Python library brings the core capabilities of shapr to the Python ecosystem.
arXiv Detail & Related papers (2025-04-02T15:47:30Z) - Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z) - Improving the Weighting Strategy in KernelSHAP [0.8057006406834466]
In Explainable AI (XAI), Shapley values are a popular framework for explaining predictions made by complex machine learning models. We propose a novel modification of KernelSHAP which replaces the deterministic weights with stochastic ones to reduce the variance of the resulting Shapley value approximations. Our methods can reduce the required number of contribution function evaluations by $5\%$ to $50\%$ while preserving the same accuracy of the approximated Shapley values.
arXiv Detail & Related papers (2024-10-07T10:02:31Z) - Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Fast Shapley Value Estimation: A Unified Approach [71.92014859992263]
We propose a straightforward and efficient Shapley estimator, SimSHAP, by eliminating redundant techniques.
In our analysis of existing approaches, we observe that estimators can be unified as a linear transformation of randomly summed values from feature subsets.
Our experiments validate the effectiveness of our SimSHAP, which significantly accelerates the computation of accurate Shapley values.
arXiv Detail & Related papers (2023-11-02T06:09:24Z) - DU-Shapley: A Shapley Value Proxy for Efficient Dataset Valuation [23.646508094051768]
We consider the dataset valuation problem, that is, the problem of quantifying the incremental gain.
The Shapley value is a natural tool to perform dataset valuation due to its formal axiomatic justification.
We propose a novel approximation, referred to as discrete uniform Shapley, which is expressed as an expectation under a discrete uniform distribution.
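The "expectation under a discrete uniform distribution" framing can be illustrated with a generic Monte Carlo Shapley estimator that draws coalition sizes uniformly. This is a sketch of the general sampling idea only, not the DU-Shapley estimator from the paper; the `value` callable, player names, and the additive test game are all assumptions.

```python
import numpy as np

def uniform_size_shapley(value, players, n_samples=2000, seed=None):
    """Monte Carlo Shapley estimate via uniform coalition sizes.

    For each player i: draw a size s uniformly from {0, ..., n-1}, draw a
    random size-s coalition S not containing i, and average the marginal
    gain value(S + (i,)) - value(S). This sampling scheme is an unbiased
    estimator of the exact Shapley value.
    """
    rng = np.random.default_rng(seed)
    n = len(players)
    phi = np.zeros(n)
    for idx, i in enumerate(players):
        others = [p for p in players if p != i]
        total = 0.0
        for _ in range(n_samples):
            s = rng.integers(0, n)                  # size uniform on {0, ..., n-1}
            picked = rng.choice(len(others), size=s, replace=False)
            S = tuple(others[j] for j in picked)    # random size-s coalition
            total += value(S + (i,)) - value(S)
        phi[idx] = total / n_samples
    return phi
```

For an additive game (the value of a coalition is the sum of its members' worths), every marginal contribution equals the player's own worth, so the estimate recovers the worths exactly regardless of the number of samples.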
arXiv Detail & Related papers (2023-06-03T10:22:50Z) - WeightedSHAP: analyzing and improving Shapley based feature attributions [17.340091573913316]
Shapley value is a popular approach for measuring the influence of individual features.
We propose WeightedSHAP, which generalizes the Shapley value and learns which marginal contributions to focus directly from data.
On several real-world datasets, we demonstrate that the influential features identified by WeightedSHAP are better able to recapitulate the model's predictions.
arXiv Detail & Related papers (2022-09-27T14:34:07Z) - Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search [96.20505710087392]
We propose a Shapley value based method to evaluate operation contribution (Shapley-NAS) for neural architecture search.
We show that our method outperforms the state-of-the-art methods by a considerable margin with light search cost.
arXiv Detail & Related papers (2022-06-20T14:41:49Z) - Lassoed Tree Boosting [53.56229983630983]
We prove that a gradient boosted tree algorithm with early stopping achieves faster-than-$n^{-1/4}$ L2 convergence in the large nonparametric space of càdlàg functions of bounded sectional variation.
Our convergence proofs are based on a novel, general theorem on early stopping with empirical loss minimizers of nested Donsker classes.
arXiv Detail & Related papers (2022-05-22T00:34:41Z) - Online Sub-Sampling for Reinforcement Learning with General Function Approximation [111.01990889581243]
In this paper, we establish an efficient online sub-sampling framework that measures the information gain of data points collected by an RL algorithm.
For a value-based method with a complexity-bounded function class, we show that the policy only needs to be updated $\operatorname{polylog}(K)$ times.
In contrast to existing approaches that update the policy at least $\Omega(K)$ times, our approach drastically reduces the number of optimization calls in solving for a policy.
arXiv Detail & Related papers (2021-06-14T07:36:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.