Divide-and-Conquer: Cold-Start Bundle Recommendation via Mixture of Diffusion Experts
- URL: http://arxiv.org/abs/2505.05035v1
- Date: Thu, 08 May 2025 08:13:44 GMT
- Title: Divide-and-Conquer: Cold-Start Bundle Recommendation via Mixture of Diffusion Experts
- Authors: Ming Li, Lin Li, Xiaohui Tao, Dong Zhang, Jimmy Xiangji Huang
- Abstract summary: Cold-start bundle recommendation focuses on modeling new bundles with insufficient information to provide recommendations. We propose a novel Mixture of Diffusion Experts (MoDiffE) framework, which employs a divide-and-conquer strategy for cold-start bundle recommendation. MoDiffE significantly outperforms existing solutions in handling cold-start bundle recommendation.
- Score: 20.922606156035197
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cold-start bundle recommendation focuses on modeling new bundles with insufficient information to provide recommendations. Advanced bundle recommendation models usually learn bundle representations from multiple views (e.g., interaction view) at both the bundle and item levels. Consequently, the cold-start problem for bundles is more challenging than that for traditional items due to the dual-level multi-view complexity. In this paper, we propose a novel Mixture of Diffusion Experts (MoDiffE) framework, which employs a divide-and-conquer strategy for cold-start bundle recommendation and follows three steps: (1) Divide: The bundle cold-start problem is divided into independent but similar sub-problems sequentially by level and view, which can be summarized as the poor representation of feature-missing bundles in prior-embedding models. (2) Conquer: Beyond prior-embedding models that fundamentally provide the embedded representations, we introduce a diffusion-based method to solve all sub-problems in a unified way, which directly generates diffusion representations using diffusion models without depending on specific features. (3) Combine: A cold-aware hierarchical Mixture of Experts (MoE) is employed to combine results of the sub-problems for final recommendations, where the two models for each view serve as experts and are adaptively fused for different bundles in a multi-layer manner. Additionally, MoDiffE adopts a multi-stage decoupled training pipeline and introduces a cold-start gating augmentation method to enable the training of gating for cold bundles. Through extensive experiments on three real-world datasets, we demonstrate that MoDiffE significantly outperforms existing solutions in handling cold-start bundle recommendation. It achieves up to a 0.1027 absolute gain in Recall@20 in cold-start scenarios and up to a 47.43% relative improvement in all-bundle scenarios.
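The "Combine" step above is the most architecture-specific part of the description, so a small sketch may help make it concrete. The PyTorch snippet below is a minimal, hypothetical illustration of a cold-aware hierarchical Mixture of Experts: for each view, a prior-embedding expert and a diffusion expert are fused by a gate that also sees a per-bundle coldness flag, and a second gating layer then fuses the bundle-level and item-level results. All module and variable names (ColdAwareGate, HierarchicalMoE, cold_flag, and so on) are assumptions introduced for this sketch; it is not the authors' implementation.

```python
# Hypothetical sketch of a cold-aware hierarchical MoE fusion, inspired by the
# "Combine" step described in the abstract. Names and structure are assumptions.
import torch
import torch.nn as nn


class ColdAwareGate(nn.Module):
    """Produces mixing weights over two experts, conditioned on a coldness flag."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim + 1, dim),
            nn.ReLU(),
            nn.Linear(dim, 2),
        )

    def forward(self, rep_a, rep_b, cold_flag):
        # rep_a, rep_b: [batch, dim]; cold_flag: [batch, 1] (1.0 = feature-missing bundle)
        gate_in = torch.cat([rep_a, rep_b, cold_flag], dim=-1)
        return torch.softmax(self.net(gate_in), dim=-1)  # [batch, 2]


class HierarchicalMoE(nn.Module):
    """Fuse (prior-embedding, diffusion) experts per view, then fuse the two levels.

    For brevity the per-view gates are shared between the bundle and item levels;
    a faithful implementation would likely keep separate experts per level.
    """

    def __init__(self, dim: int, num_views: int = 2):
        super().__init__()
        self.view_gates = nn.ModuleList([ColdAwareGate(dim) for _ in range(num_views)])
        self.level_gate = ColdAwareGate(dim)

    def _fuse(self, gate, rep_a, rep_b, cold_flag):
        w = gate(rep_a, rep_b, cold_flag)
        return w[:, :1] * rep_a + w[:, 1:] * rep_b

    def forward(self, bundle_views, item_views, cold_flag):
        # bundle_views / item_views: lists of (prior_rep, diffusion_rep) pairs, one per view.
        bundle_rep = torch.stack(
            [self._fuse(self.view_gates[i], p, d, cold_flag) for i, (p, d) in enumerate(bundle_views)]
        ).mean(dim=0)
        item_rep = torch.stack(
            [self._fuse(self.view_gates[i], p, d, cold_flag) for i, (p, d) in enumerate(item_views)]
        ).mean(dim=0)
        # Second gating layer fuses the bundle-level and item-level representations.
        return self._fuse(self.level_gate, bundle_rep, item_rep, cold_flag)


if __name__ == "__main__":
    # Usage sketch with random tensors standing in for expert outputs.
    batch, dim = 4, 64
    moe = HierarchicalMoE(dim, num_views=2)
    bundle_views = [(torch.randn(batch, dim), torch.randn(batch, dim)) for _ in range(2)]
    item_views = [(torch.randn(batch, dim), torch.randn(batch, dim)) for _ in range(2)]
    cold = torch.randint(0, 2, (batch, 1)).float()
    fused = moe(bundle_views, item_views, cold)
    print(fused.shape)  # torch.Size([4, 64])
```

Conditioning the gates on the coldness flag is what would let feature-missing bundles lean on the diffusion expert while warm bundles keep relying on their prior embeddings, mirroring the cold-start gating augmentation idea described in the abstract.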
Related papers
- Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities [85.83359661628575]
We propose Progressive Inference-Time Annealing (PITA) to learn diffusion-based samplers. PITA combines two complementary techniques: Annealing of the Boltzmann distribution and Diffusion smoothing. It enables equilibrium sampling of N-body particle systems, Alanine Dipeptide, and tripeptides in Cartesian coordinates.
arXiv Detail & Related papers (2025-06-19T17:14:22Z) - Personalized Diffusion Model Reshapes Cold-Start Bundle Recommendation [2.115789253980982]
We propose a new approach to generate a bundle in distribution space for each user to tackle the cold-start challenge. DisCo relies on a personalized Diffusion backbone, enhanced by disentangled aspects for the user's interest. DisCo outperforms five comparative baselines by a large margin on three real-world datasets.
arXiv Detail & Related papers (2025-05-20T20:52:31Z) - BRIDGE: Bundle Recommendation via Instruction-Driven Generation [2.115789253980982]
BRIDGE is a novel framework for bundle recommendation. It consists of two main components, namely the correlation-based item clustering and the pseudo bundle generation modules. Results validate the superiority of our models over state-of-the-art ranking-based methods across five benchmark datasets.
arXiv Detail & Related papers (2024-12-24T02:07:53Z) - Non-autoregressive Personalized Bundle Generation [39.83349922956341]
We propose to perform bundle generation via a non-autoregressive mechanism and design a novel encoder-decoder framework named BundleNAT.
In detail, we propose to adopt pre-training techniques and a graph neural network to fully embed user-based preference and item-based compatibility information.
We then design a permutation-equivariant decoding architecture that is able to directly output the desired bundle in a one-shot manner.
arXiv Detail & Related papers (2024-06-11T03:44:17Z) - EpiDiff: Enhancing Multi-View Synthesis via Localized Epipolar-Constrained Diffusion [60.30030562932703]
EpiDiff is a localized interactive multiview diffusion model.
It generates 16 multiview images in just 12 seconds.
It surpasses previous methods in quality evaluation metrics.
arXiv Detail & Related papers (2023-12-11T05:20:52Z) - Cold-start Bundle Recommendation via Popularity-based Coalescence and Curriculum Heating [16.00757636715368]
Existing methods for cold-start item recommendation are not readily applicable to bundles.
We propose CoHeat, an accurate approach for cold-start bundle recommendation.
CoHeat demonstrates superior performance in cold-start bundle recommendation, achieving up to 193% higher nDCG@20 compared to the best competitor.
arXiv Detail & Related papers (2023-10-05T18:02:03Z) - Matcher: Segment Anything with One Shot Using All-Purpose Feature Matching [63.88319217738223]
We present Matcher, a novel perception paradigm that utilizes off-the-shelf vision foundation models to address various perception tasks.
Matcher demonstrates impressive generalization performance across various segmentation tasks, all without training.
Our results further showcase the open-world generality and flexibility of Matcher when applied to images in the wild.
arXiv Detail & Related papers (2023-05-22T17:59:43Z) - Unite and Conquer: Plug & Play Multi-Modal Synthesis using Diffusion Models [54.1843419649895]
We propose a solution based on denoising diffusion probabilistic models (DDPMs).
Our motivation for choosing diffusion models over other generative models comes from the flexible internal structure of diffusion models.
Our method can unite multiple diffusion models trained on multiple sub-tasks and conquer the combined task.
arXiv Detail & Related papers (2022-12-01T18:59:55Z) - Fast Multi-view Clustering via Ensembles: Towards Scalability, Superiority, and Simplicity [63.85428043085567]
We propose a fast multi-view clustering via ensembles (FastMICE) approach.
The concept of random view groups is presented to capture the versatile view-wise relationships.
FastMICE has almost linear time and space complexity, and is free of dataset-specific tuning.
arXiv Detail & Related papers (2022-03-22T09:51:24Z) - Privileged Graph Distillation for Cold Start Recommendation [57.918041397089254]
The cold start problem in recommender systems requires recommending to new users (items) based on attributes without any historical interaction records.
We propose a privileged graph distillation model (PGD).
Our proposed model is generally applicable to different cold start scenarios with new user, new item, or new user-new item.
arXiv Detail & Related papers (2021-05-31T14:05:27Z) - Bundle Recommendation with Graph Convolutional Networks [71.95344006365914]
Existing solutions integrate user-item interaction modeling into bundle recommendation by sharing model parameters or learning in a multi-task manner.
We propose a graph neural network model named BGCN (short for Bundle Graph Convolutional Network) for bundle recommendation.
arXiv Detail & Related papers (2020-05-07T13:48:26Z)