Denoising Long- and Short-term Interests for Sequential Recommendation
- URL: http://arxiv.org/abs/2407.14743v1
- Date: Sat, 20 Jul 2024 03:52:14 GMT
- Title: Denoising Long- and Short-term Interests for Sequential Recommendation
- Authors: Xinyu Zhang, Beibei Li, Beihong Jin
- Abstract summary: We propose a Long- and Short-term Interest Denoising Network (LSIDN).
We employ a session-level interest extraction and evolution strategy to avoid introducing inter-session behavioral noise into long-term interest modeling.
Results of experiments on two public datasets show that LSIDN consistently outperforms state-of-the-art models.
- Score: 11.830033570949944
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: User interests can be viewed over different time scales, mainly including stable long-term preferences and changing short-term intentions, and their combination facilitates comprehensive sequential recommendation. However, existing work that focuses on different time scales of user modeling has ignored the negative effects of noise at different time scales, which hinders capturing actual user interests and cannot be resolved by conventional sequential denoising methods. In this paper, we propose a Long- and Short-term Interest Denoising Network (LSIDN), which employs different encoders and tailored denoising strategies to extract long- and short-term interests, respectively, achieving both comprehensive and robust user modeling. Specifically, we employ a session-level interest extraction and evolution strategy to avoid introducing inter-session behavioral noise into long-term interest modeling; we also adopt contrastive learning equipped with a homogeneous exchanging augmentation to alleviate the impact of unintentional behavioral noise on short-term interest modeling. Results of experiments on two public datasets show that LSIDN consistently outperforms state-of-the-art models and achieves significant robustness.
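The abstract's "homogeneous exchanging augmentation" pairs two behavior sequences from the same user and swaps a fraction of their items to produce perturbed views for contrastive learning. The sketch below illustrates that idea only in broad strokes; the function name `homogeneous_exchange` and the `swap_ratio` parameter are hypothetical and do not come from the paper, which may define the augmentation differently.

```python
import random

def homogeneous_exchange(session_a, session_b, swap_ratio=0.2, seed=0):
    """Hypothetical sketch: swap a fraction of items between two
    behavior sessions of the *same* user (hence "homogeneous"),
    yielding two augmented views for a contrastive objective."""
    rng = random.Random(seed)
    a, b = list(session_a), list(session_b)
    # Number of swaps scales with the shorter session's length.
    n_swaps = max(1, int(min(len(a), len(b)) * swap_ratio))
    for _ in range(n_swaps):
        i = rng.randrange(len(a))
        j = rng.randrange(len(b))
        a[i], b[j] = b[j], a[i]
    return a, b

view_a, view_b = homogeneous_exchange([1, 2, 3, 4, 5], [6, 7, 8, 9, 10])
```

Because items are only exchanged between a user's own sessions, the perturbation stays within that user's behavioral distribution, which is what makes the augmented views suitable positives for contrastive learning.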
Related papers
- HADL Framework for Noise Resilient Long-Term Time Series Forecasting [0.7810572107832383]
Long-term time series forecasting is critical in domains such as finance, economics, and energy.
The impact of temporal noise in extended lookback windows remains underexplored, often degrading model performance and computational efficiency.
We propose a novel framework that addresses these challenges by integrating the Discrete Wavelet Transform (DWT) and Discrete Cosine Transform (DCT).
Our approach demonstrates competitive robustness to noisy input, significantly reduces computational complexity, and achieves competitive or state-of-the-art forecasting performance across diverse benchmark datasets.
arXiv Detail & Related papers (2025-02-14T21:41:42Z) - Multi-granularity Interest Retrieval and Refinement Network for Long-Term User Behavior Modeling in CTR Prediction [68.90783662117936]
Click-through Rate (CTR) prediction is crucial for online personalization platforms.
Recent advancements have shown that modeling rich user behaviors can significantly improve the performance of CTR prediction.
We propose the Multi-granularity Interest Retrieval and Refinement Network (MIRRN).
arXiv Detail & Related papers (2024-11-22T15:29:05Z) - TimeBridge: Non-Stationarity Matters for Long-term Time Series Forecasting [49.6208017412376]
TimeBridge is a novel framework designed to bridge the gap between non-stationarity and dependency modeling.
TimeBridge consistently achieves state-of-the-art performance in both short-term and long-term forecasting.
arXiv Detail & Related papers (2024-10-06T10:41:03Z) - Improved Noise Schedule for Diffusion Training [51.849746576387375]
We propose a novel approach to design the noise schedule for enhancing the training of diffusion models.
We empirically demonstrate the superiority of our noise schedule over the standard cosine schedule.
arXiv Detail & Related papers (2024-07-03T17:34:55Z) - SelfGNN: Self-Supervised Graph Neural Networks for Sequential Recommendation [15.977789295203976]
We propose a novel framework called Self-Supervised Graph Neural Network (SelfGNN) for sequential recommendation.
The SelfGNN framework encodes short-term graphs based on time intervals and utilizes Graph Neural Networks (GNNs) to learn short-term collaborative relationships.
Our personalized self-augmented learning structure enhances model robustness by mitigating noise in short-term graphs based on long-term user interests and personal stability.
arXiv Detail & Related papers (2024-05-31T14:53:12Z) - Denoising Time Cycle Modeling for Recommendation [19.62210742613065]
We argue that existing methods ignore the variety of temporal patterns of user behaviors.
We propose Denoising Time Cycle Modeling (DiCycle), a novel approach to denoise user behaviors.
DiCycle is able to explicitly model diverse time cycle patterns for recommendation.
arXiv Detail & Related papers (2024-02-05T04:28:08Z) - IDNP: Interest Dynamics Modeling using Generative Neural Processes for Sequential Recommendation [40.4445022666304]
We present an Interest Dynamics modeling framework using generative Neural Processes, coined IDNP, to model user interests from a functional perspective.
Our model outperforms state-of-the-art baselines on various evaluation metrics.
arXiv Detail & Related papers (2022-08-09T08:33:32Z) - Learning Self-Modulating Attention in Continuous Time Space with Applications to Sequential Recommendation [102.24108167002252]
We propose a novel attention network, named self-modulating attention, that models the complex and non-linearly evolving dynamic user preferences.
We empirically demonstrate the effectiveness of our method on top-N sequential recommendation tasks, and the results on three large-scale real-world datasets show that our model can achieve state-of-the-art performance.
arXiv Detail & Related papers (2022-03-30T03:54:11Z) - Dynamic Memory based Attention Network for Sequential Recommendation [79.5901228623551]
We propose a novel long sequential recommendation model called Dynamic Memory-based Attention Network (DMAN).
It segments the overall long behavior sequence into a series of sub-sequences, then trains the model and maintains a set of memory blocks to preserve long-term interests of users.
Based on the dynamic memory, the user's short-term and long-term interests can be explicitly extracted and combined for efficient joint recommendation.
arXiv Detail & Related papers (2021-02-18T11:08:54Z) - On Dynamic Noise Influence in Differentially Private Learning [102.6791870228147]
Private Gradient Descent (PGD) is a commonly used private learning framework, which injects noise according to the differential privacy protocol.
Recent studies show that dynamic privacy schedules can improve utility at the final iteration, yet theoretical understanding of the effectiveness of such schedules remains limited.
This paper provides comprehensive analysis of noise influence in dynamic privacy schedules to answer these critical questions.
arXiv Detail & Related papers (2021-01-19T02:04:00Z) - MRIF: Multi-resolution Interest Fusion for Recommendation [0.0]
This paper presents a multi-resolution interest fusion model (MRIF) that takes both properties of users' interests into consideration.
The proposed model is capable of capturing the dynamic changes in users' interests at different temporal ranges, and provides an effective way to combine a group of multi-resolution user interests to make predictions.
arXiv Detail & Related papers (2020-07-08T02:32:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.