TFPS: A Temporal Filtration-enhanced Positive Sample Set Construction Method for Implicit Collaborative Filtering
- URL: http://arxiv.org/abs/2602.22521v1
- Date: Thu, 26 Feb 2026 01:32:45 GMT
- Title: TFPS: A Temporal Filtration-enhanced Positive Sample Set Construction Method for Implicit Collaborative Filtering
- Authors: Jiayi Wu, Zhengyu Wu, Xunkai Li, Rong-Hua Li, Guoren Wang
- Abstract summary: We propose a novel temporal filtration-enhanced approach to construct a high-quality positive sample set. We provide theoretical insights into why TFPS can improve Recall@k and NDCG@k.
- Score: 40.89512526196666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The negative sampling strategy can effectively train collaborative filtering (CF) recommendation models based on implicit feedback by constructing positive and negative samples. However, existing methods primarily optimize the negative sampling process while neglecting the exploration of positive samples. Some denoising recommendation methods can be applied to denoise positive samples within negative sampling strategies, but they ignore temporal information. Existing work integrates sequential information during model aggregation but neglects time interval information, hindering accurate capture of users' current preferences. To address this problem, from a data perspective, we propose a novel temporal filtration-enhanced approach to construct a high-quality positive sample set. First, we design a time decay model based on interaction time intervals, transforming the original graph into a weighted user-item bipartite graph. Then, based on predefined filtering operations, the weighted user-item bipartite graph is layered. Finally, we design a layer-enhancement strategy to construct a high-quality positive sample set for the layered subgraphs. We provide theoretical insights into why TFPS can improve Recall@k and NDCG@k, and extensive experiments on three real-world datasets demonstrate the effectiveness of the proposed method. Additionally, TFPS can be integrated with various implicit CF recommenders or negative sampling methods to enhance their performance.
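The pipeline described in the abstract (time-decayed edge weighting, graph layering by filtering, positive-set selection) can be sketched as follows. This is a minimal illustrative sketch only: the abstract does not specify the TFPS decay function or the predefined filtering operations, so the exponential decay form, the `decay` parameter, and the single fixed threshold below are assumptions.

```python
import math
from collections import defaultdict

def build_weighted_graph(interactions, t_now, decay=0.05):
    """Turn (user, item, timestamp) interactions into a weighted
    user-item bipartite graph. The exponential time decay is an
    assumed form; the paper only states that weights are derived
    from interaction time intervals."""
    graph = defaultdict(dict)  # user -> {item: weight}
    for user, item, ts in interactions:
        w = math.exp(-decay * (t_now - ts))
        # If an edge repeats, keep its strongest (most recent) weight.
        graph[user][item] = max(graph[user].get(item, 0.0), w)
    return graph

def layer_and_select(graph, threshold=0.5):
    """Split each user's edges into a high-weight and a low-weight
    layer, and take the high-weight layer as that user's positive
    sample set. The single fixed threshold stands in for the paper's
    predefined filtering operations."""
    positives = {}
    for user, items in graph.items():
        high = [i for i, w in items.items() if w >= threshold]
        # Fall back to all interacted items if the filter empties the set.
        positives[user] = high if high else list(items)
    return positives

# Example: a recent interaction keeps near-full weight, an old one decays away.
interactions = [("u1", "item_a", 100), ("u1", "item_b", 0)]
graph = build_weighted_graph(interactions, t_now=100)
print(layer_and_select(graph))  # only the recent item survives filtering
```

In practice, the selected positive set would then be fed to any implicit CF recommender or negative sampling strategy in place of the raw interaction set, which is how the abstract describes TFPS being used as a plug-in.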
Related papers
- A Topology-Aware Positive Sample Set Construction and Feature Optimization Method in Implicit Collaborative Filtering [40.89512526196666]
Negative sampling strategies are widely used in implicit collaborative filtering to address issues like data sparsity and class imbalance. These strategies often introduce false negatives, hindering the model's ability to accurately learn users' latent preferences. We propose a Topology-aware Positive Sample Set Construction and Feature optimization method (TPSC-FO).
arXiv Detail & Related papers (2026-02-20T15:35:48Z) - A Simple yet Effective Negative Sampling Plugin for Constructing Positive Sample Pairs in Implicit Collaborative Filtering [40.89512526196666]
PSP-NS is a negative sampling plugin for collaborative filtering. It builds a user-item bipartite graph with edge weights indicating interaction confidence. It generates positive sample pairs via replication-based reweighting to strengthen positive signals. PSP-NS boosts Recall@30 and Precision@30 by 32.11% and 22.90% on Yelp over the strongest baselines.
arXiv Detail & Related papers (2026-02-20T13:34:43Z) - Improving LLM-based Recommendation with Self-Hard Negatives from Intermediate Layers [80.55429742713623]
ILRec is a novel preference fine-tuning framework for LLM-based recommender systems. We introduce a lightweight collaborative filtering model to assign token-level rewards for negative signals. Experiments on three datasets demonstrate ILRec's effectiveness in enhancing the performance of LLM-based recommender systems.
arXiv Detail & Related papers (2026-02-19T14:37:43Z) - Preference Trajectory Modeling via Flow Matching for Sequential Recommendation [50.077447974294586]
Sequential recommendation predicts each user's next item based on their historical interaction sequence. FlowRec is a simple yet effective sequential recommendation framework. We construct a personalized behavior-based prior distribution to replace Gaussian noise and learn a vector field to model user preference trajectories.
arXiv Detail & Related papers (2025-08-25T02:55:42Z) - Can LLM-Driven Hard Negative Sampling Empower Collaborative Filtering? Findings and Potentials [9.668242919588199]
Hard negative samples can accelerate model convergence and optimize decision boundaries. This paper introduces the concept of Semantic Negative Sampling. We propose a framework called HNLMRec, based on fine-tuning LLMs supervised by collaborative signals.
arXiv Detail & Related papers (2025-04-07T04:39:45Z) - Finding the Sweet Spot: Preference Data Construction for Scaling Preference Optimization [66.67988187816185]
We aim to scale up the number of on-policy samples via repeated random sampling to improve alignment performance. Our experiments reveal that this strategy leads to a decline in performance as the sample size increases. We introduce a scalable preference data construction strategy that consistently enhances model performance as the sample scale increases.
arXiv Detail & Related papers (2025-02-24T04:22:57Z) - SCONE: A Novel Stochastic Sampling to Generate Contrastive Views and Hard Negative Samples for Recommendation [28.886714896469737]
Graph-based collaborative filtering (CF) has emerged as a promising approach in recommender systems. Despite its achievements, graph-based CF models face challenges due to data sparsity and negative sampling. In this paper, we propose a novel sampling for i) COntrastive views and ii) hard NEgative samples (SCONE) to overcome these issues.
arXiv Detail & Related papers (2024-05-01T02:27:59Z) - Dimension Independent Mixup for Hard Negative Sample in Collaborative Filtering [36.26865960551565]
Negative sampling plays a vital role in training CF-based models with implicit feedback.
We propose Dimension Independent Mixup for Hard Negative Sampling (DINS), which is the first Area-wise sampling method for training CF-based models.
Our work contributes a new perspective, introduces Area-wise sampling, and presents DINS as a novel approach for negative sampling.
arXiv Detail & Related papers (2023-06-28T04:03:31Z) - Generating Negative Samples for Sequential Recommendation [83.60655196391855]
We propose to Generate Negative Samples (items) for Sequential Recommendation (SR).
A negative item is sampled at each time step based on the current SR model's learned user preferences toward items.
Experiments on four public datasets verify the importance of providing high-quality negative samples for SR.
arXiv Detail & Related papers (2022-08-07T05:44:13Z) - Rethinking InfoNCE: How Many Negative Samples Do You Need? [54.146208195806636]
We study how many negative samples are optimal for InfoNCE in different scenarios via a semi-quantitative theoretical framework.
We estimate the optimal negative sampling ratio using the $K$ value that maximizes the training effectiveness function.
arXiv Detail & Related papers (2021-05-27T08:38:29Z)