Personalized Diffusion Model Reshapes Cold-Start Bundle Recommendation
- URL: http://arxiv.org/abs/2505.14901v1
- Date: Tue, 20 May 2025 20:52:31 GMT
- Title: Personalized Diffusion Model Reshapes Cold-Start Bundle Recommendation
- Authors: Tuan-Nghia Bui, Huy-Son Nguyen, Cam-Van Thi Nguyen, Hoang-Quynh Le, Duc-Trong Le
- Abstract summary: We propose a new approach to generate a bundle in distribution space for each user to tackle the cold-start challenge. DisCo relies on a personalized Diffusion backbone, enhanced by disentangled aspects of the user's interest. DisCo outperforms five comparative baselines by a large margin on three real-world datasets.
- Score: 2.115789253980982
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bundle recommendation aims to recommend a set of items to each user. However, user-bundle interactions are far sparser than user-item interactions, which poses a major challenge, especially in cold-start scenarios. Traditional collaborative filtering methods perform poorly on this problem because they rely on observed interactions to update latent embeddings, and such interactions are scarce or absent in a cold-start setting. We propose a new approach, DisCo, which relies on a personalized Diffusion backbone, enhanced by disentangled aspects of the user's interest, to generate a bundle in distribution space for each user and thereby tackle the cold-start challenge. During training, DisCo adds an additional objective term to counteract bias, a prevalent issue when generative models are used for top-$K$ recommendation. Our empirical experiments show that DisCo outperforms five comparative baselines by a large margin on three real-world datasets. This study thereby provides a promising framework and essential viewpoints for cold-start recommendation. Our materials for reproducibility are available at: https://github.com/bt-nghia/DisCo.
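For intuition, below is a minimal, hedged sketch of the kind of pipeline the abstract describes: a diffusion-style denoiser over user-bundle interaction vectors, conditioned on disentangled user-aspect embeddings, trained with a reconstruction loss plus an auxiliary debiasing term. It is not the authors' implementation (see the linked repository for that); the class names, aspect-embedding layout, noise schedule, and the specific debiasing term are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DisentangledDiffusionRecommender(nn.Module):
    """Sketch of a denoiser f(x_t, t, user) that is conditioned on several
    disentangled user-aspect embeddings and predicts clean bundle scores x_0."""
    def __init__(self, n_users, n_bundles, n_aspects=4, d=64, n_steps=50):
        super().__init__()
        self.n_steps = n_steps
        # One flat table holding n_aspects embeddings per user (illustrative choice).
        self.user_aspects = nn.Embedding(n_users, n_aspects * d)
        self.time_emb = nn.Embedding(n_steps, d)
        self.net = nn.Sequential(
            nn.Linear(n_bundles + n_aspects * d + d, 512), nn.SiLU(),
            nn.Linear(512, n_bundles),
        )
        # Simple linear beta schedule for the forward (noising) process.
        alphas_bar = torch.cumprod(1.0 - torch.linspace(1e-4, 0.02, n_steps), dim=0)
        self.register_buffer("alphas_bar", alphas_bar)

    def forward(self, x_t, t, users):
        cond = self.user_aspects(users)                      # disentangled interest aspects
        h = torch.cat([x_t, cond, self.time_emb(t)], dim=-1)
        return self.net(h)                                   # predicted x_0 (bundle scores)

def training_step(model, x0, users, lambda_debias=0.1):
    """x0: float rows of the user-bundle interaction matrix.
    Loss = denoising reconstruction + a stand-in debiasing term
    (the paper's exact auxiliary objective may differ)."""
    t = torch.randint(0, model.n_steps, (x0.size(0),), device=x0.device)
    a_bar = model.alphas_bar[t].unsqueeze(-1)
    noise = torch.randn_like(x0)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise   # forward diffusion q(x_t | x_0)
    x0_hat = model(x_t, t, users)
    recon = F.mse_loss(x0_hat, x0)
    # Debiasing stand-in: rank observed bundles above one sampled negative bundle.
    neg = torch.randint(0, x0.size(1), (x0.size(0),), device=x0.device)
    pos_score = (x0_hat * x0).sum(-1) / x0.sum(-1).clamp(min=1.0)
    neg_score = x0_hat.gather(1, neg.unsqueeze(1)).squeeze(1)
    debias = -F.logsigmoid(pos_score - neg_score).mean()
    return recon + lambda_debias * debias
```

At inference time, one would start from Gaussian noise, run the reverse process conditioned on the (possibly cold-start) user, and rank bundles by the generated scores to form the top-$K$ list.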
Related papers
- Divide-and-Conquer: Cold-Start Bundle Recommendation via Mixture of Diffusion Experts [20.922606156035197]
Cold-start bundle recommendation focuses on modeling new bundles with insufficient information to provide recommendations. We propose a novel Mixture of Diffusion Experts (MoDiffE) framework, which employs a divide-and-conquer strategy for cold-start bundle recommendation. MoDiffE significantly outperforms existing solutions in handling cold-start bundle recommendation.
arXiv Detail & Related papers (2025-05-08T08:13:44Z) - Bake Two Cakes with One Oven: RL for Defusing Popularity Bias and Cold-start in Third-Party Library Recommendations [5.874782446136913]
Third-party libraries (TPLs) have become an integral part of modern software development, enhancing developer productivity and accelerating time-to-market. Existing TPL recommenders typically rely on collaborative filtering (CF), which exploits a two-dimensional project-library matrix (user-item in the general recommendation context). We propose a reinforcement learning (RL)-based approach to address popularity bias and the cold-start problem in TPL recommendation.
arXiv Detail & Related papers (2025-04-18T16:17:20Z) - Shallow AutoEncoding Recommender with Cold Start Handling via Side Features [2.8680286413498903]
We introduce an augmented EASE model that seamlessly integrates both user and item side information to address cold-start issues. Our method strikes a balance by effectively recommending cold-start items and handling cold-start users without incurring extra bias.
arXiv Detail & Related papers (2025-04-03T05:27:55Z) - Cold-start Bundle Recommendation via Popularity-based Coalescence and
Curriculum Heating [16.00757636715368]
Existing methods for cold-start item recommendation are not readily applicable to bundles.
We propose CoHeat, an accurate approach for cold-start bundle recommendation.
CoHeat demonstrates superior performance in cold-start bundle recommendation, achieving up to 193% higher nDCG@20 compared to the best competitor.
arXiv Detail & Related papers (2023-10-05T18:02:03Z) - Rethinking Missing Data: Aleatoric Uncertainty-Aware Recommendation [59.500347564280204]
We propose a new Aleatoric Uncertainty-aware Recommendation (AUR) framework.
AUR consists of a new uncertainty estimator along with a normal recommender model.
As the chance of mislabeling reflects the potential of a pair, AUR makes recommendations according to the uncertainty.
arXiv Detail & Related papers (2022-09-22T04:32:51Z) - Price DOES Matter! Modeling Price and Interest Preferences in
Session-based Recommendation [55.0391061198924]
Session-based recommendation aims to predict items that an anonymous user would like to purchase based on her short behavior sequence.
It is nontrivial to incorporate price preferences for session-based recommendation.
We propose a novel method Co-guided Heterogeneous Hypergraph Network (CoHHN) for session-based recommendation.
arXiv Detail & Related papers (2022-05-09T10:47:15Z) - Sequential Recommendation via Stochastic Self-Attention [68.52192964559829]
Transformer-based approaches embed items as vectors and use dot-product self-attention to measure the relationship between items.
We propose a novel STOchastic Self-Attention (STOSA) model to overcome these issues.
We devise a novel Wasserstein Self-Attention module to characterize item-item position-wise relationships in sequences (an illustrative sketch of this distance follows the related-papers list below).
arXiv Detail & Related papers (2022-01-16T12:38:45Z) - Learning to Learn a Cold-start Sequential Recommender [70.5692886883067]
The cold-start recommendation is an urgent problem in contemporary online applications.
We propose a meta-learning based cold-start sequential recommendation framework called metaCSR.
metaCSR holds the ability to learn the common patterns from regular users' behaviors.
arXiv Detail & Related papers (2021-10-18T08:11:24Z) - Privileged Graph Distillation for Cold Start Recommendation [57.918041397089254]
The cold start problem in recommender systems requires recommending to new users (items) based on attributes without any historical interaction records.
We propose a privileged graph distillation model (PGD).
Our proposed model is generally applicable to different cold-start scenarios: new user, new item, or new user-new item.
arXiv Detail & Related papers (2021-05-31T14:05:27Z) - Cold-start Sequential Recommendation via Meta Learner [10.491428090228768]
We propose a Meta-learning-based Cold-Start Sequential Recommendation Framework, namely Mecos, to mitigate the item cold-start problem in sequential recommendation.
Mecos effectively extracts user preference from limited interactions and learns to match the target cold-start item with the potential user.
arXiv Detail & Related papers (2020-12-10T05:23:13Z) - Seamlessly Unifying Attributes and Items: Conversational Recommendation
for Cold-Start Users [111.28351584726092]
We consider conversational recommendation for cold-start users, where a system can interactively both ask a user about attributes and recommend items.
Our Conversational Thompson Sampling (ConTS) model holistically solves all questions in conversational recommendation by choosing the arm with the maximal reward to play.
arXiv Detail & Related papers (2020-05-23T08:56:37Z)
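For the STOSA entry above, the core quantity is a 2-Wasserstein distance between Gaussian item embeddings, which replaces the dot product in self-attention. The sketch below shows one way such a score can be computed for diagonal Gaussians; it is an illustrative assumption, not the authors' code, and the function names and scaling choice are hypothetical.

```python
import torch

def wasserstein2_diag(mu_q, sigma_q, mu_k, sigma_k):
    """Squared 2-Wasserstein distance between diagonal Gaussians.
    mu_*, sigma_*: (..., L, d) means and standard deviations."""
    mean_term = ((mu_q.unsqueeze(-2) - mu_k.unsqueeze(-3)) ** 2).sum(-1)  # (..., Lq, Lk)
    cov_term = ((sigma_q.unsqueeze(-2) - sigma_k.unsqueeze(-3)) ** 2).sum(-1)
    return mean_term + cov_term

def distribution_attention(mu_q, sigma_q, mu_k, sigma_k, mu_v):
    # A smaller distance should yield a larger attention weight, so negate before softmax.
    d2 = wasserstein2_diag(mu_q, sigma_q, mu_k, sigma_k)
    attn = torch.softmax(-d2 / mu_q.size(-1) ** 0.5, dim=-1)
    return attn @ mu_v  # aggregate value means; a full model would also propagate covariances
```

Because the score is a distance over distributions rather than a dot product between point vectors, the covariance terms directly influence how strongly two items attend to each other.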