Personalized Behavior-Aware Transformer for Multi-Behavior Sequential Recommendation
- URL: http://arxiv.org/abs/2402.14473v1
- Date: Thu, 22 Feb 2024 12:03:21 GMT
- Title: Personalized Behavior-Aware Transformer for Multi-Behavior Sequential Recommendation
- Authors: Jiajie Su, Chaochao Chen, Zibin Lin, Xi Li, Weiming Liu, and Xiaolin
Zheng
- Abstract summary: We propose a Personalized Behavior-Aware Transformer framework (PBAT) for the Multi-Behavior Sequential Recommendation (MBSR) problem.
PBAT develops a personalized behavior pattern generator in the representation layer, which extracts dynamic and discriminative behavior patterns for sequential learning.
We conduct experiments on three benchmark datasets and the results demonstrate the effectiveness and interpretability of our framework.
- Score: 25.400756652696895
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Sequential Recommendation (SR) captures users' dynamic preferences by
modeling how users transition among items. However, SR models that utilize only a
single type of behavior interaction data suffer performance degradation when
the sequences are short. To tackle this problem, we focus on Multi-Behavior
Sequential Recommendation (MBSR) in this paper, which aims to leverage
time-evolving heterogeneous behavioral dependencies for better exploring users'
potential intents on the target behavior. Solving MBSR is challenging. On the
one hand, users exhibit diverse multi-behavior patterns due to personal
characteristics. On the other hand, there exists comprehensive co-influence
between behavior correlations and item collaborations, the intensity of which
is deeply affected by temporal factors. To tackle these challenges, we propose
a Personalized Behavior-Aware Transformer framework (PBAT) for the MBSR problem,
which models personalized patterns and multifaceted sequential collaborations
in a novel way to boost recommendation performance. First, PBAT develops a
personalized behavior pattern generator in the representation layer, which
extracts dynamic and discriminative behavior patterns for sequential learning.
Second, PBAT reforms the self-attention layer with a behavior-aware
collaboration extractor, which introduces a fused behavior-aware attention
mechanism for incorporating both behavioral and temporal impacts into
collaborative transitions. We conduct experiments on three benchmark datasets
and the results demonstrate the effectiveness and interpretability of our
framework. Our implementation code is released at
https://github.com/TiliaceaeSU/PBAT.
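The abstract describes a fused attention mechanism that folds both behavioral and temporal impacts into collaborative transitions. As an illustration only, not the paper's actual formulation, the minimal sketch below fuses standard dot-product attention logits with a learned behavior-pair bias and a simple temporal-decay term; the bias matrix `B`, the `decay` rate, and the additive fusion are all assumptions for demonstration:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(1)
n, d, num_behaviors = 5, 16, 3

items = rng.normal(size=(n, d))                        # item embeddings in one user's sequence
behaviors = rng.integers(0, num_behaviors, size=n)     # behavior type per interaction
timestamps = np.sort(rng.uniform(0, 100, size=n))      # interaction times

# Hypothetical learned behavior-pair bias B[b_i, b_j] and temporal decay rate.
B = rng.normal(scale=0.1, size=(num_behaviors, num_behaviors))
decay = 0.01

logits = items @ items.T / np.sqrt(d)                                # item collaboration
logits += B[behaviors[:, None], behaviors[None, :]]                  # behavioral impact
logits -= decay * np.abs(timestamps[:, None] - timestamps[None, :])  # temporal impact

weights = softmax(logits)   # fused behavior- and time-aware attention weights
context = weights @ items
print(context.shape)  # (5, 16)
```

PBAT additionally models personalized behavior patterns as distributions in its representation layer, which this scalar-bias sketch does not attempt to capture.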
Related papers
- Multi-Grained Preference Enhanced Transformer for Multi-Behavior Sequential Recommendation [29.97854124851886]
Sequential recommendation aims to predict the next purchased item according to users' dynamic preferences learned from their historical user-item interactions.
Existing methods model heterogeneous multi-behavior dependencies only at the behavior level or the item level; modeling interaction-level dependencies remains a challenge.
We propose a Multi-Grained Preference enhanced Transformer framework (M-GPT) to tackle these challenges.
arXiv Detail & Related papers (2024-11-19T02:45:17Z)
- HMAR: Hierarchical Masked Attention for Multi-Behaviour Recommendation [6.946903076677841]
We introduce Hierarchical Masked Attention for multi-behavior recommendation (HMAR)
Our approach applies masked self-attention to items of the same behavior, followed by self-attention across all behaviors.
Our proposed model operates in a multi-task setting, allowing it to learn item behaviors and their associated ranking scores concurrently.
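The two-stage masking described above (attention restricted to same-behavior items, then attention across all behaviors) can be sketched as follows; this is a simplified illustration under assumed shapes, not HMAR's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, mask=None):
    """Scaled dot-product attention; mask=True marks allowed positions."""
    logits = q @ k.T / np.sqrt(q.shape[-1])
    if mask is not None:
        logits = np.where(mask, logits, -1e9)  # block disallowed pairs
    return softmax(logits) @ v

rng = np.random.default_rng(0)
n, d = 6, 8
x = rng.normal(size=(n, d))                 # item embeddings in a sequence
behaviors = np.array([0, 0, 1, 0, 1, 1])    # hypothetical types, e.g. 0=click, 1=purchase

# Stage 1: masked self-attention within each behavior type.
same = behaviors[:, None] == behaviors[None, :]
h = attention(x, x, x, mask=same)

# Stage 2: unmasked self-attention across all behaviors.
out = attention(h, h, h)
print(out.shape)  # (6, 8)
```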
arXiv Detail & Related papers (2024-04-29T14:54:37Z)
- Coarse-to-Fine Knowledge-Enhanced Multi-Interest Learning Framework for Multi-Behavior Recommendation [52.89816309759537]
Multi-types of behaviors (e.g., clicking, adding to cart, purchasing, etc.) widely exist in most real-world recommendation scenarios.
The state-of-the-art multi-behavior models learn behavior dependencies indistinguishably with all historical interactions as input.
We propose a novel Coarse-to-fine Knowledge-enhanced Multi-interest Learning framework to learn shared and behavior-specific interests for different behaviors.
arXiv Detail & Related papers (2022-08-03T05:28:14Z)
- Incorporating Heterogeneous User Behaviors and Social Influences for Predictive Analysis [32.31161268928372]
We aim to incorporate heterogeneous user behaviors and social influences for behavior predictions.
This paper proposes a variant of Long Short-Term Memory (LSTM) that can consider context while modeling a behavior sequence.
A residual learning-based decoder is designed to automatically construct multiple high-order cross features based on social behavior representation.
arXiv Detail & Related papers (2022-07-24T17:05:37Z)
- Recommender Transformers with Behavior Pathways [50.842316273120744]
We build the Recommender Transformer (RETR) with a novel Pathway Attention mechanism.
We empirically verify the effectiveness of RETR on seven real-world datasets.
arXiv Detail & Related papers (2022-06-13T08:58:37Z)
- Multi-Behavior Sequential Recommendation with Temporal Graph Transformer [66.10169268762014]
We tackle the dynamic user-item relation learning with the awareness of multi-behavior interactive patterns.
We propose a new Temporal Graph Transformer (TGT) recommendation framework to jointly capture dynamic short-term and long-range user-item interactive patterns.
arXiv Detail & Related papers (2022-06-06T15:42:54Z)
- Learning Self-Modulating Attention in Continuous Time Space with Applications to Sequential Recommendation [102.24108167002252]
We propose a novel attention network, named self-modulating attention, that models the complex and non-linearly evolving dynamic user preferences.
We empirically demonstrate the effectiveness of our method on top-N sequential recommendation tasks, and the results on three large-scale real-world datasets show that our model can achieve state-of-the-art performance.
arXiv Detail & Related papers (2022-03-30T03:54:11Z)
- Multiplex Behavioral Relation Learning for Recommendation via Memory Augmented Transformer Network [25.563806871858073]
This work proposes a Memory-Augmented Transformer Network (MATN) to enable recommendation with multiplex behavioral relational information.
In our MATN framework, we first develop a transformer-based multi-behavior relation encoder to make the learned interaction representations reflective of the cross-type behavior relations.
A memory attention network is proposed to help MATN capture the contextual signals of different behavior types in the category-specific latent embedding space.
arXiv Detail & Related papers (2021-10-08T09:54:43Z)
- Knowledge-Enhanced Hierarchical Graph Transformer Network for Multi-Behavior Recommendation [56.12499090935242]
This work proposes a Knowledge-Enhanced Hierarchical Graph Transformer Network (KHGT) to investigate multi-typed interactive patterns between users and items in recommender systems.
KHGT is built upon a graph-structured neural architecture to capture type-specific behavior characteristics.
We show that KHGT consistently outperforms many state-of-the-art recommendation methods across various evaluation settings.
arXiv Detail & Related papers (2021-10-08T09:44:00Z)
- Contrastive Self-supervised Sequential Recommendation with Robust Augmentation [101.25762166231904]
Sequential Recommendation describes a set of techniques that model dynamic user behavior in order to predict future interactions in sequential user data.
Old and new issues remain, including data sparsity and noisy data.
We propose Contrastive Self-Supervised Learning for sequential Recommendation (CoSeRec).
arXiv Detail & Related papers (2021-08-14T07:15:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.