Cold-Start Recommendation towards the Era of Large Language Models (LLMs): A Comprehensive Survey and Roadmap
- URL: http://arxiv.org/abs/2501.01945v2
- Date: Thu, 16 Jan 2025 18:53:23 GMT
- Title: Cold-Start Recommendation towards the Era of Large Language Models (LLMs): A Comprehensive Survey and Roadmap
- Authors: Weizhi Zhang, Yuanchen Bei, Liangwei Yang, Henry Peng Zou, Peilin Zhou, Aiwei Liu, Yinghui Li, Hao Chen, Jianling Wang, Yu Wang, Feiran Huang, Sheng Zhou, Jiajun Bu, Allen Lin, James Caverlee, Fakhri Karray, Irwin King, Philip S. Yu
- Abstract summary: The cold-start problem is one of the long-standing challenges in recommender systems.
Due to the diversification of internet platforms and the exponential growth of users and items, the importance of cold-start recommendation (CSR) is becoming increasingly evident.
This paper provides a comprehensive review and discussion on the roadmap, related literature, and future directions of CSR.
- Score: 78.26201062505814
- Abstract: The cold-start problem is one of the long-standing challenges in recommender systems: it concerns accurately modeling new or interaction-limited users and items so as to provide better recommendations. Due to the diversification of internet platforms and the exponential growth of users and items, the importance of cold-start recommendation (CSR) is becoming increasingly evident. At the same time, large language models (LLMs) have achieved tremendous success and possess strong capabilities for modeling user and item information, offering new potential for cold-start recommendation. However, the CSR research community still lacks a comprehensive review of and reflection on this field. In this paper, we therefore take the perspective of the LLM era and provide a comprehensive review and discussion of the roadmap, related literature, and future directions of CSR. Specifically, we trace how existing CSR methods utilize information, from content features, graph relations, and domain information to the world knowledge possessed by large language models, aiming to provide new insights for both the research and industrial communities. Related resources for cold-start recommendation are collected and continuously updated for the community at https://github.com/YuanchenBei/Awesome-Cold-Start-Recommendation.
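As a hedged illustration of the roadmap's endpoint, where the world knowledge in pretrained language models stands in for missing interactions, the sketch below scores a brand-new item against a user profile built from a few known interactions using text embeddings only. The encoder choice (`all-MiniLM-L6-v2` via `sentence-transformers`) and the averaging-based profile are illustrative assumptions, not a method prescribed by the survey.
```python
# Hedged sketch: cold-start item scoring from text alone, using a pretrained
# text encoder as a stand-in for LLM-derived world knowledge. Model choice
# and library are illustrative assumptions, not the survey's prescription.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")

def embed(texts):
    """Encode texts into L2-normalized vectors."""
    vecs = encoder.encode(texts, convert_to_numpy=True)
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

# A brand-new item with zero interactions: only its description is available.
new_item = "Wireless noise-cancelling headphones with 40-hour battery life."

# A user profile built from the few items the user has already interacted with.
history = [
    "Bluetooth over-ear headphones",
    "Portable USB-C power bank",
]
user_profile = embed(history).mean(axis=0)

# Content-based cold-start score: similarity of the item text to the profile.
score = float(embed([new_item])[0] @ user_profile)
print(f"cold-start relevance score: {score:.3f}")
```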
Related papers
- Generative Large Recommendation Models: Emerging Trends in LLMs for Recommendation [85.52251362906418]
This tutorial explores two primary approaches for integrating large language models (LLMs) into recommender systems.
It provides a comprehensive overview of generative large recommendation models, including their recent advancements, challenges, and potential research directions.
Key topics include data quality, scaling laws, user behavior mining, and efficiency in training and inference.
arXiv Detail & Related papers (2025-02-19T14:48:25Z)
- All Roads Lead to Rome: Unveiling the Trajectory of Recommender Systems Across the LLM Era [63.649070507815715]
We aim to integrate recommender systems into a broader picture, and pave the way for more comprehensive solutions for future research.
We identify two evolution paths of modern recommender systems -- via list-wise recommendation and conversational recommendation.
We point out that the information effectiveness of recommendation increases, while the user's acquisition cost decreases.
arXiv Detail & Related papers (2024-07-14T05:02:21Z)
- Keyword-driven Retrieval-Augmented Large Language Models for Cold-start User Recommendations [5.374800961359305]
We introduce KALM4Rec, a framework to address the problem of cold-start user restaurant recommendations.
KALM4Rec operates in two main stages: candidate retrieval and LLM-based candidate re-ranking (see the sketch after this entry).
Our evaluation, using a Yelp restaurant dataset with user reviews from three English-speaking cities, shows that our proposed framework significantly improves recommendation quality.
arXiv Detail & Related papers (2024-05-30T02:00:03Z)
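A minimal sketch of the two-stage pipeline summarized in the KALM4Rec entry above: keyword-driven candidate retrieval followed by LLM-based re-ranking. The token-overlap retriever, the prompt, and the `llm` callable are illustrative assumptions standing in for KALM4Rec's actual retrieval and prompting details.
```python
# Hedged sketch of a two-stage cold-start pipeline: keyword-driven retrieval,
# then LLM-based re-ranking. Token-overlap retrieval and the re-ranking stub
# are illustrative assumptions, not KALM4Rec's actual implementation.
from typing import Callable

def retrieve_candidates(user_keywords, restaurants, top_k=20):
    """Stage 1: score restaurants by keyword overlap with the cold-start user."""
    kw = {w.lower() for w in user_keywords}
    scored = []
    for name, review_keywords in restaurants.items():
        overlap = len(kw & {w.lower() for w in review_keywords})
        scored.append((overlap, name))
    scored.sort(reverse=True)
    return [name for overlap, name in scored[:top_k] if overlap > 0]

def rerank_with_llm(user_keywords, candidates, llm: Callable[[str], str]):
    """Stage 2: ask an LLM (any prompt -> text callable) to order the candidates."""
    prompt = (
        "A new user cares about: " + ", ".join(user_keywords) + ".\n"
        "Rank these restaurants from best to worst match, one per line:\n"
        + "\n".join(candidates)
    )
    ranked = [line.strip() for line in llm(prompt).splitlines() if line.strip()]
    # Keep only known candidates, preserving the LLM's order.
    return [r for r in ranked if r in candidates]

# Usage with a trivial stand-in "LLM" that echoes the candidates unchanged.
restaurants = {"Pho Corner": ["vietnamese", "noodles"], "La Pizzeria": ["pizza", "italian"]}
cands = retrieve_candidates(["noodles", "spicy"], restaurants)
print(rerank_with_llm(["noodles", "spicy"], cands, llm=lambda p: "\n".join(cands)))
```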
- A Review of Modern Recommender Systems Using Generative Models (Gen-RecSys) [57.30228361181045]
This survey connects key advancements in recommender systems using Generative Models (Gen-RecSys).
It covers: interaction-driven generative models; the use of large language models (LLM) and textual data for natural language recommendation; and the integration of multimodal models for generating and processing images/videos in RS.
Our work highlights necessary paradigms for evaluating the impact and harm of Gen-RecSys and identifies open challenges.
arXiv Detail & Related papers (2024-03-31T06:57:57Z)
- Breaking the Barrier: Utilizing Large Language Models for Industrial Recommendation Systems through an Inferential Knowledge Graph [19.201697767418597]
We propose a novel Large Language Model-based Complementary Knowledge Enhanced Recommendation System (LLM-KERec).
It extracts unified concept terms from item and user information to capture user intent transitions and adapt to new items (see the sketch after this entry).
Extensive experiments conducted on three industry datasets demonstrate the significant performance improvement of our model compared to existing approaches.
arXiv Detail & Related papers (2024-02-21T12:22:01Z)
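A hedged sketch of the idea summarized in the LLM-KERec entry above: use an LLM to extract unified concept terms from item text, then surface new items whose concepts complement those of a user's recent purchases. The prompt, the `llm` callable, and the toy complementary-concept map are illustrative assumptions, not the paper's actual pipeline or its inferential knowledge graph.
```python
# Hedged sketch: extract concept terms from item text with an LLM, then use a
# complementary-concept map to surface new items. The prompt, the llm callable,
# and the toy concept map are illustrative assumptions, not LLM-KERec itself.
def extract_concepts(item_title, llm):
    """Ask an LLM (prompt -> text callable) for comma-separated concept terms."""
    answer = llm(f"List 3 short concept terms for the product: {item_title}")
    return [c.strip().lower() for c in answer.split(",") if c.strip()]

def recommend_new_items(purchased_titles, new_items, complement_map, llm):
    """Rank brand-new items by how many of their concepts complement the user's."""
    user_concepts = set()
    for title in purchased_titles:
        user_concepts.update(extract_concepts(title, llm))
    wanted = {c for u in user_concepts for c in complement_map.get(u, [])}
    scored = []
    for title in new_items:
        concepts = set(extract_concepts(title, llm))
        scored.append((len(concepts & wanted), title))
    scored.sort(reverse=True)
    return [title for score, title in scored if score > 0]

# Usage with a toy concept map and a trivial stand-in "LLM".
complement_map = {"camera": ["tripod", "sd card"], "laptop": ["mouse", "sleeve"]}
fake_llm = lambda prompt: "camera, lens, photo" if "DSLR" in prompt else "tripod, stand, mount"
print(recommend_new_items(["DSLR camera kit"], ["Aluminium travel tripod"], complement_map, fake_llm))
```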
- Recommender Systems in the Era of Large Language Models (LLMs) [62.0129013439038]
Large Language Models (LLMs) have revolutionized the fields of Natural Language Processing (NLP) and Artificial Intelligence (AI).
We conduct a comprehensive review of LLM-empowered recommender systems from various aspects including Pre-training, Fine-tuning, and Prompting.
arXiv Detail & Related papers (2023-07-05T06:03:40Z)
- Could Small Language Models Serve as Recommenders? Towards Data-centric Cold-start Recommendations [38.91330250981614]
We present PromptRec, a simple but effective approach based on in-context learning of language models (see the scoring sketch after this entry).
We propose to enhance small language models for recommender systems with a data-centric pipeline.
To the best of our knowledge, this is the first study to tackle the system cold-start recommendation problem.
arXiv Detail & Related papers (2023-06-29T18:50:12Z)
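A hedged sketch of prompt-based, in-context scoring in the spirit of the PromptRec entry above: a small language model rates a cold-start user-item pair by comparing its preference for "yes" versus "no" after a natural-language prompt. The model choice (`gpt2` via Hugging Face `transformers`) and the prompt template are assumptions, not the paper's exact setup.
```python
# Hedged sketch of in-context, prompt-based cold-start scoring: compare the
# model's preference for "yes" vs. "no" after a natural-language prompt.
# Model choice (gpt2) and the template are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def click_score(user_desc, item_desc):
    """Return the relative preference that the described user likes the item."""
    prompt = (
        f"User profile: {user_desc}\n"
        f"Item: {item_desc}\n"
        "Would this user enjoy this item? Answer:"
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]  # next-token logits
    yes_id = tokenizer(" yes", add_special_tokens=False)["input_ids"][0]
    no_id = tokenizer(" no", add_special_tokens=False)["input_ids"][0]
    probs = torch.softmax(logits[[yes_id, no_id]], dim=-1)
    return probs[0].item()  # probability mass on "yes" relative to "no"

print(click_score("a runner who buys lightweight gear", "trail running shoes"))
```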
- Learning to Learn a Cold-start Sequential Recommender [70.5692886883067]
Cold-start recommendation is an urgent problem in contemporary online applications.
We propose a meta-learning based cold-start sequential recommendation framework called metaCSR.
metaCSR can learn common patterns from regular users' behaviors (a generic meta-learning sketch follows this entry).
arXiv Detail & Related papers (2021-10-18T08:11:24Z)
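A hedged sketch of the general meta-learning recipe behind frameworks like metaCSR, shown here as a Reptile-style inner/outer loop over per-user tasks with a toy linear scorer; the data layout and the first-order meta-update are illustrative assumptions, not metaCSR's actual architecture.
```python
# Hedged sketch of a meta-learning loop (Reptile-style) for cold-start
# recommendation: each "task" is one user's short interaction history, and the
# meta-parameters are nudged toward weights adapted on that user.
import torch

dim = 8
meta_w = torch.zeros(dim)          # meta-initialization shared across users
inner_lr, meta_lr, inner_steps = 0.1, 0.5, 3

def adapt(w, items, labels, steps, lr):
    """Inner loop: a few SGD steps of logistic regression on one user's history."""
    w = w.clone().requires_grad_(True)
    for _ in range(steps):
        loss = torch.nn.functional.binary_cross_entropy_with_logits(items @ w, labels)
        (grad,) = torch.autograd.grad(loss, w)
        w = (w - lr * grad).detach().requires_grad_(True)
    return w.detach()

# Toy "tasks": each user has a handful of item feature vectors and click labels.
tasks = [
    (torch.randn(5, dim), torch.randint(0, 2, (5,)).float())
    for _ in range(20)
]

for items, labels in tasks:                        # outer (meta) loop over users
    adapted = adapt(meta_w, items, labels, inner_steps, inner_lr)
    meta_w = meta_w + meta_lr * (adapted - meta_w)  # Reptile meta-update

# A brand-new user adapts quickly from the learned meta-initialization.
new_items, new_labels = torch.randn(3, dim), torch.tensor([1.0, 0.0, 1.0])
user_w = adapt(meta_w, new_items, new_labels, inner_steps, inner_lr)
print("cold-start user scores:", torch.sigmoid(new_items @ user_w))
```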
- Using Social Media Background to Improve Cold-start Recommendation Deep Models [8.444156978118087]
We investigate whether social media background can be used as extra contextual information to improve recommendation models.
Building on an existing deep neural network model, we propose a method to represent temporal social media background as embeddings (a fusion sketch follows this entry).
The results show that our method of fusing social media background with the existing model does generally improve recommendation performance.
arXiv Detail & Related papers (2021-06-04T04:46:29Z)
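A hedged sketch of one way to fuse a temporal social-media background embedding with a base recommender, via concatenation and an MLP scorer; the dimensions and the fusion choice are illustrative assumptions, not the paper's exact model.
```python
# Hedged sketch: fuse a social-media background embedding with a base
# recommender's user/item embeddings via concatenation + MLP. Dimensions and
# the fusion choice are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn

class FusionRecommender(nn.Module):
    def __init__(self, n_users, n_items, emb_dim=32, background_dim=16):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, emb_dim)
        self.item_emb = nn.Embedding(n_items, emb_dim)
        self.scorer = nn.Sequential(
            nn.Linear(2 * emb_dim + background_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, user_ids, item_ids, background):
        """background: per-interaction social-media context vectors (B, background_dim)."""
        x = torch.cat([self.user_emb(user_ids), self.item_emb(item_ids), background], dim=-1)
        return self.scorer(x).squeeze(-1)

# Usage: score two user-item pairs with their social-media context vectors.
model = FusionRecommender(n_users=100, n_items=500)
scores = model(torch.tensor([0, 1]), torch.tensor([10, 42]), torch.randn(2, 16))
print(scores.shape)  # torch.Size([2])
```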
- CHAMELEON: A Deep Learning Meta-Architecture for News Recommender Systems [Phd. Thesis] [0.43512163406551996]
CHAMELEON is a Deep Learning meta-architecture designed to tackle the challenges of news recommendation.
It consists of a modular reference architecture that can be instantiated with different neural building blocks (a generic sketch follows this entry).
Experiments on two large datasets have shown the effectiveness of CHAMELEON for news recommendation.
arXiv Detail & Related papers (2019-12-29T13:40:56Z)
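A hedged sketch of what a modular meta-architecture can look like in code, in the spirit of the CHAMELEON entry above: a news recommender assembled from swappable building blocks (a content encoder and a session encoder). The specific modules are illustrative assumptions, not CHAMELEON's actual instantiation.
```python
# Hedged sketch of a modular meta-architecture: a news recommender assembled
# from swappable building blocks. The specific modules are illustrative
# assumptions, not CHAMELEON's actual instantiation.
import torch
import torch.nn as nn

class NewsRecommender(nn.Module):
    """Meta-architecture: plug in a content encoder and a session encoder."""
    def __init__(self, content_encoder: nn.Module, session_encoder: nn.Module):
        super().__init__()
        self.content_encoder = content_encoder
        self.session_encoder = session_encoder

    def forward(self, session_article_feats, candidate_feats):
        # Encode each clicked article, summarize the session, score candidates.
        clicked = self.content_encoder(session_article_feats)      # (B, T, D)
        session = self.session_encoder(clicked)[0][:, -1]          # (B, D), RNN-style output assumed
        cands = self.content_encoder(candidate_feats)               # (B, N, D)
        return torch.einsum("bd,bnd->bn", session, cands)           # (B, N) scores

# Instantiate with simple building blocks; either module can be swapped out.
content = nn.Sequential(nn.Linear(300, 64), nn.ReLU())   # e.g. over article text features
session = nn.GRU(64, 64, batch_first=True)
model = NewsRecommender(content, session)
scores = model(torch.randn(2, 5, 300), torch.randn(2, 10, 300))
print(scores.shape)  # torch.Size([2, 10])
```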
This list is automatically generated from the titles and abstracts of the papers on this site.