Interpretable Triplet Importance for Personalized Ranking
- URL: http://arxiv.org/abs/2407.19469v1
- Date: Sun, 28 Jul 2024 11:46:55 GMT
- Title: Interpretable Triplet Importance for Personalized Ranking
- Authors: Bowei He, Chen Ma
- Abstract summary: We propose a Shapley value-based method to measure triplet importance in an interpretable manner.
Our model consistently outperforms the state-of-the-art methods.
- Score: 5.409302364904161
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Personalized item ranking has been a crucial component contributing to the performance of recommender systems. As a representative approach, pairwise ranking directly optimizes the ranking with user implicit feedback by constructing (\textit{user}, \textit{positive item}, \textit{negative item}) triplets. Several recent works have observed that treating all triplets equally can hardly achieve the best results, and they instead assign different importance scores to negative items, user-item pairs, or triplets. However, almost all the generated importance scores are groundless and hard to interpret, and thus far from trustworthy and transparent. To tackle these issues, we propose the \textit{Triplet Shapley} -- a Shapley value-based method to measure triplet importance in an interpretable manner. Because of the huge number of triplets, we replace the exact Shapley value calculation with a Monte Carlo (MC) approximation, for which an unbiasedness guarantee is also provided. To stabilize the MC approximation, we adopt a control variates-based method. Finally, we utilize the triplet Shapley values to guide the resampling of important triplets to benefit model learning. Extensive experiments are conducted on six public datasets involving classical matrix factorization- and graph neural network-based recommendation models. Empirical results and subsequent analysis show that our model consistently outperforms the state-of-the-art methods.
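The abstract describes the estimation pipeline only at a high level. As a rough illustration of what a permutation-sampling Monte Carlo estimate of per-triplet Shapley values with a classic control-variate correction could look like, here is a minimal Python sketch; the `utility` callback (for example, the validation AUC of a model retrained on a triplet subset), the choice of covariate, and all function names are assumptions made for illustration, not the authors' implementation.

```python
import random
from typing import Callable, List, Sequence, Tuple

Triplet = Tuple[int, int, int]  # (user, positive item, negative item)


def mc_marginal_contributions(
    triplets: Sequence[Triplet],
    utility: Callable[[List[Triplet]], float],  # e.g. validation AUC of a model fit on a triplet subset
    num_permutations: int = 100,
    seed: int = 0,
) -> List[List[float]]:
    """Permutation-sampling Monte Carlo step of Shapley estimation.

    For each random permutation of the training triplets, a triplet's marginal
    contribution is the utility gain from appending it to the triplets that
    precede it in the permutation. Averaging these samples gives an unbiased
    estimate of the triplet's Shapley value.
    """
    rng = random.Random(seed)
    n = len(triplets)
    contributions: List[List[float]] = [[] for _ in range(n)]

    for _ in range(num_permutations):
        order = list(range(n))
        rng.shuffle(order)
        prefix: List[Triplet] = []
        prev = utility(prefix)  # utility of the empty coalition
        for idx in order:
            prefix.append(triplets[idx])
            cur = utility(prefix)
            contributions[idx].append(cur - prev)
            prev = cur
    return contributions


def control_variate_mean(samples: List[float],
                         covariates: List[float],
                         covariate_mean: float) -> float:
    """Classic control-variate correction of one Monte Carlo mean.

    `covariates` are paired values of a cheap statistic (e.g. the triplet's
    current pairwise loss) whose exact mean `covariate_mean` is known; the
    corrected estimate keeps the same expectation but has lower variance when
    the statistic correlates with the samples.
    """
    m = len(samples)
    s_mean = sum(samples) / m
    c_mean = sum(covariates) / m
    cov = sum((s - s_mean) * (c - c_mean) for s, c in zip(samples, covariates)) / m
    var = sum((c - c_mean) ** 2 for c in covariates) / m
    beta = cov / var if var > 0 else 0.0
    return s_mean - beta * (c_mean - covariate_mean)
```

The expensive part is the `utility` callback, which is exactly what the paper's MC approximation and its stabilization are meant to tame; the sketch only shows the estimator scaffolding around it.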
Related papers
- Multi-threshold Deep Metric Learning for Facial Expression Recognition [60.26967776920412]
We present a multi-threshold deep metric learning technique that avoids difficult threshold validation.
We find that each threshold of the triplet loss intrinsically determines a distinctive distribution of inter-class variations.
This makes the embedding layer, which is composed of a set of slices, a more informative and discriminative feature representation.
arXiv Detail & Related papers (2024-06-24T08:27:31Z) - Tripod: Three Complementary Inductive Biases for Disentangled Representation Learning [52.70210390424605]
In this work, we consider endowing a neural network autoencoder with three select inductive biases from the literature.
In practice, however, naively combining existing techniques instantiating these inductive biases fails to yield significant benefits.
We propose adaptations to the three techniques that simplify the learning problem, equip key regularization terms with stabilizing invariances, and quash degenerate incentives.
The resulting model, Tripod, achieves state-of-the-art results on a suite of four image disentanglement benchmarks.
arXiv Detail & Related papers (2024-04-16T04:52:41Z) - Fast Shapley Value Estimation: A Unified Approach [71.92014859992263]
We propose a straightforward and efficient Shapley estimator, SimSHAP, by eliminating redundant techniques.
In our analysis of existing approaches, we observe that estimators can be unified as a linear transformation of randomly summed values from feature subsets.
Our experiments validate the effectiveness of our SimSHAP, which significantly accelerates the computation of accurate Shapley values.
arXiv Detail & Related papers (2023-11-02T06:09:24Z) - Efficient Shapley Values Estimation by Amortization for Text Classification [66.7725354593271]
We develop an amortized model that directly predicts each input feature's Shapley Value without additional model evaluations.
Experimental results on two text classification datasets demonstrate that our amortized model estimates Shapley Values accurately with up to 60 times speedup.
arXiv Detail & Related papers (2023-05-31T16:19:13Z) - Walk-and-Relate: A Random-Walk-based Algorithm for Representation Learning on Sparse Knowledge Graphs [5.444459446244819]
We propose an efficient method to augment the number of triplets to address the problem of data sparsity.
We also provide approaches to accurately and efficiently select informative metapaths from the set of possible metapaths.
The proposed approaches are model-agnostic, and the augmented training dataset can be used with any KG embedding approach out of the box.
arXiv Detail & Related papers (2022-09-19T05:35:23Z) - Adapting Triplet Importance of Implicit Feedback for Personalized Recommendation [43.85549591503592]
Implicit feedback is frequently used for developing personalized recommendation services.
We propose a novel training framework named Triplet Importance Learning (TIL), which adaptively learns the importance scores of training triplets (a minimal importance-weighted resampling sketch appears after this list).
We show that our proposed method outperforms the best existing models by 3-21% in terms of Recall@k for top-k recommendation.
arXiv Detail & Related papers (2022-08-02T19:44:47Z) - On Modality Bias Recognition and Reduction [70.69194431713825]
We study the modality bias problem in the context of multi-modal classification.
We propose a plug-and-play loss function method, whereby the feature space for each label is adaptively learned.
Our method yields remarkable performance improvements compared with the baselines.
arXiv Detail & Related papers (2022-02-25T13:47:09Z) - Dynamic Iterative Refinement for Efficient 3D Hand Pose Estimation [87.54604263202941]
We propose a tiny deep neural network whose partial layers are iteratively reused to refine its previous estimations.
We employ learned gating criteria to decide whether to exit from the weight-sharing loop, allowing per-sample adaptation in our model.
Our method consistently outperforms state-of-the-art 2D/3D hand pose estimation approaches in terms of both accuracy and efficiency for widely used benchmarks.
arXiv Detail & Related papers (2021-11-11T23:31:34Z) - Joint Shapley values: a measure of joint feature importance [6.169364905804678]
We introduce joint Shapley values, which directly extend the Shapley axioms.
Joint Shapley values measure a set of features' average effect on a model's prediction.
Results for games show that joint Shapley values present different insights from existing interaction indices.
arXiv Detail & Related papers (2021-07-23T17:22:37Z) - Maximizing Conditional Entropy for Batch-Mode Active Learning of Perceptual Metrics [14.777274711706653]
We present a novel approach for batch mode active metric learning using the Maximum Entropy Principle.
We take advantage of the monotonically increasing submodular entropy function to construct an efficient greedy algorithm.
Our approach is the first batch-mode active metric learning method to define a unified score that balances informativeness and diversity for an entire batch of triplets.
arXiv Detail & Related papers (2021-02-15T06:55:17Z)
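Both the main paper (Shapley-guided resampling of important triplets) and the TIL entry above share the same final step: once every (user, positive item, negative item) triplet carries an importance score, training either reweights or resamples the triplets before the pairwise update. The sketch below shows one plausible wiring of importance-proportional resampling feeding a standard BPR loss; the softmax temperature, the score source, and all names are illustrative assumptions, not either paper's actual code.

```python
import math
import random
from typing import List, Sequence, Tuple

Triplet = Tuple[int, int, int]  # (user, positive item, negative item)


def resample_by_importance(triplets: Sequence[Triplet],
                           scores: Sequence[float],
                           sample_size: int,
                           temperature: float = 1.0,
                           seed: int = 0) -> List[Triplet]:
    """Draw a training subset with probability proportional to softmax(score / temperature).

    Higher-importance triplets (e.g. larger Shapley values) are drawn more often;
    the temperature controls how aggressively the distribution is skewed.
    """
    rng = random.Random(seed)
    max_s = max(scores)
    weights = [math.exp((s - max_s) / temperature) for s in scores]  # numerically stable softmax weights
    return rng.choices(list(triplets), weights=weights, k=sample_size)


def bpr_loss(score_pos: float, score_neg: float) -> float:
    """Standard BPR loss for one triplet: negative log-sigmoid of the score margin."""
    return -math.log(1.0 / (1.0 + math.exp(-(score_pos - score_neg))))
```

A training loop would then score the sampled triplets with the recommendation model (matrix factorization or a graph neural network, per the experiments above) and minimize the summed BPR loss over each resampled batch.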