Transition Relation Aware Self-Attention for Session-based
Recommendation
- URL: http://arxiv.org/abs/2203.06407v1
- Date: Sat, 12 Mar 2022 10:54:34 GMT
- Title: Transition Relation Aware Self-Attention for Session-based
Recommendation
- Authors: Guanghui Zhu, Haojun Hou, Jingfan Chen, Chunfeng Yuan, Yihua Huang
- Abstract summary: Session-based recommendation is a challenging problem in real-world scenarios.
Recent graph neural networks (GNNs) have emerged as the state-of-the-art methods for session-based recommendation.
We propose a novel approach for session-based recommendation, called Transition Relation Aware Self-Attention.
- Score: 11.202585147927122
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Session-based recommendation, which aims to predict the next click
action based on an anonymous session, is a challenging problem in real-world
scenarios such as e-commerce, short-video platforms, and music platforms. Recently,
graph neural networks (GNNs) have emerged as the state-of-the-art methods for
session-based recommendation. However, we find that these methods have two
limitations. One is that item transition relations are not fully exploited,
since the relations are not modeled explicitly. The other is that long-range
dependencies between items cannot be captured effectively due to the
limitations of GNNs. To address these problems, we propose a novel approach
for session-based recommendation, called Transition Relation Aware
Self-Attention (TRASA). Specifically, TRASA first converts the session to a
graph and then encodes the shortest path between items with a gated recurrent
unit as their transition relation. Then, to capture the long-range
dependencies, TRASA utilizes the self-attention mechanism to build the direct
connection between any two items without going through intermediate ones. Also,
the transition relations are incorporated explicitly when computing the
attention scores. Extensive experiments on three real-world datasets demonstrate
that TRASA consistently outperforms existing state-of-the-art methods.
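To make the mechanism concrete, below is a minimal PyTorch sketch of the idea in the abstract: a pairwise transition relation is obtained by running a GRU over the shortest path between two items in the session graph, and that relation is injected as an explicit bias when computing self-attention scores. The layer sizes, the path encoding, and the exact way the relation enters the attention logit are illustrative assumptions and may differ from the paper's actual formulation.

```python
# A minimal sketch of the TRASA idea, written with PyTorch and networkx.
# The layer sizes, the shortest-path encoding, and the way the relation
# vector enters the attention logit are illustrative assumptions, not the
# paper's exact formulation.
import itertools

import networkx as nx
import torch
import torch.nn as nn


class TransitionRelationSelfAttention(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, d_model)
        # GRU that reads the item sequence along a shortest path; its final
        # hidden state serves as the pairwise transition-relation vector.
        self.path_gru = nn.GRU(d_model, d_model, batch_first=True)
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # Maps a relation vector to a scalar additive bias on the attention
        # logit (one possible way to incorporate the relation explicitly).
        self.rel_bias = nn.Linear(d_model, 1)
        self.scale = d_model ** 0.5

    def relation(self, graph: nx.DiGraph, src: int, dst: int) -> torch.Tensor:
        """Encode the shortest path src -> dst with the GRU; zeros if none."""
        try:
            path = nx.shortest_path(graph, src, dst)
        except (nx.NetworkXNoPath, nx.NodeNotFound):
            return torch.zeros(self.item_emb.embedding_dim)
        path_emb = self.item_emb(torch.tensor(path)).unsqueeze(0)  # (1, L, d)
        _, h = self.path_gru(path_emb)
        return h.squeeze()                                         # (d,)

    def forward(self, session: list[int]) -> torch.Tensor:
        # 1) Convert the session (a list of item ids) into a directed graph.
        graph = nx.DiGraph()
        graph.add_edges_from(zip(session, session[1:]))
        x = self.item_emb(torch.tensor(session))                   # (n, d)
        q, k, v = self.q(x), self.k(x), self.v(x)
        # 2) Pairwise transition relations from shortest-path GRU encodings.
        n = len(session)
        rel = torch.stack([
            self.relation(graph, session[i], session[j])
            for i, j in itertools.product(range(n), repeat=2)
        ]).view(n, n, -1)
        # 3) Self-attention connects any two items directly; the transition
        #    relation is added as an explicit bias on the attention scores.
        logits = q @ k.t() / self.scale + self.rel_bias(rel).squeeze(-1)
        attn = torch.softmax(logits, dim=-1)
        return attn @ v                                            # (n, d)
```

As a usage note, calling the module on a clicked-item sequence such as `model([12, 7, 31, 7])` returns one contextualized vector per click; taking the last position's vector as the session representation for next-item scoring is a common choice, but that readout is also an assumption here rather than something stated in the abstract.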
Related papers
- SelfGNN: Self-Supervised Graph Neural Networks for Sequential Recommendation [15.977789295203976]
We propose a novel framework called Self-Supervised Graph Neural Network (SelfGNN) for sequential recommendation.
The SelfGNN framework encodes short-term graphs based on time intervals and utilizes Graph Neural Networks (GNNs) to learn short-term collaborative relationships.
Our personalized self-augmented learning structure enhances model robustness by mitigating noise in short-term graphs based on long-term user interests and personal stability.
arXiv Detail & Related papers (2024-05-31T14:53:12Z) - TempGNN: Temporal Graph Neural Networks for Dynamic Session-Based
Recommendations [5.602191038593571]
Temporal Graph Neural Networks (TempGNN) is a generic framework for capturing the structural and temporal dynamics in complex item transitions.
TempGNN achieves state-of-the-art performance on two real-world e-commerce datasets.
arXiv Detail & Related papers (2023-10-20T03:13:10Z) - Context-aware Session-based Recommendation with Graph Neural Networks [6.825493772727133]
We propose CARES, a novel context-aware session-based recommendation model with graph neural networks.
We first construct a multi-relation cross-session graph to connect items according to intra- and cross-session item-level contexts.
To encode the variation of user interests, we design personalized item representations.
arXiv Detail & Related papers (2023-10-14T14:29:52Z) - Graph Decision Transformer [83.76329715043205]
Graph Decision Transformer (GDT) is a novel offline reinforcement learning approach.
GDT converts the input sequence into a causal graph to capture potential dependencies between fundamentally different concepts.
Our experiments show that GDT matches or surpasses the performance of state-of-the-art offline RL methods on image-based Atari and OpenAI Gym.
arXiv Detail & Related papers (2023-03-07T09:10:34Z) - Multi-Behavior Sequential Recommendation with Temporal Graph Transformer [66.10169268762014]
We tackle dynamic user-item relation learning with awareness of multi-behavior interaction patterns.
We propose a new Temporal Graph Transformer (TGT) recommendation framework to jointly capture dynamic short-term and long-range user-item interactive patterns.
arXiv Detail & Related papers (2022-06-06T15:42:54Z) - Learning Iterative Robust Transformation Synchronization [71.73273007900717]
In this work, we avoid handcrafting robust loss functions and propose to use graph neural networks (GNNs) to learn transformation synchronization.
arXiv Detail & Related papers (2021-11-01T07:03:14Z) - Continuous-Time Sequential Recommendation with Temporal Graph
Collaborative Transformer [69.0621959845251]
We propose a new framework Temporal Graph Sequential Recommender (TGSRec) upon our defined continuous-time bi-partite graph.
The TCT layer simultaneously captures collaborative signals from both users and items, while also considering temporal dynamics inside sequential patterns.
Empirical results on five datasets show that TGSRec significantly outperforms other baselines.
arXiv Detail & Related papers (2021-08-14T22:50:53Z) - Contrastive Self-supervised Sequential Recommendation with Robust
Augmentation [101.25762166231904]
Sequential recommendation describes a set of techniques to model dynamic user behavior in order to predict future interactions in sequential user data.
Old and new issues remain, including data sparsity and noisy data.
We propose Contrastive Self-Supervised Learning for sequential Recommendation (CoSeRec)
arXiv Detail & Related papers (2021-08-14T07:15:25Z) - Improved Representation Learning for Session-based Recommendation [0.0]
Session-based recommendation systems suggest relevant items to users by modeling user behavior and preferences using short-term anonymous sessions.
Existing methods leverage Graph Neural Networks (GNNs) that propagate and aggregate information from neighboring nodes.
We propose using a Transformer in combination with a target-attentive GNN, which allows richer representation learning.
arXiv Detail & Related papers (2021-07-04T00:57:28Z) - DGTN: Dual-channel Graph Transition Network for Session-based
Recommendation [19.345913200934902]
We propose a novel method, namely Dual-channel Graph Transition Network (DGTN), to model item transitions within not only the target session but also neighboring sessions.
Experiments on real-world datasets demonstrate that DGTN outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-09-21T16:29:29Z) - TAGNN: Target Attentive Graph Neural Networks for Session-based
Recommendation [66.04457457299218]
We propose a novel target attentive graph neural network (TAGNN) model for session-based recommendation.
In TAGNN, target-aware attention adaptively activates different user interests with respect to varied target items.
The learned interest representation vector varies with the target item, greatly improving the expressiveness of the model; a rough sketch of this target-attention idea follows the list below.
arXiv Detail & Related papers (2020-05-06T14:17:05Z)
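To illustrate the target-aware attention described in the TAGNN entry, here is a short PyTorch sketch in which the session representation is recomputed per candidate item, so the learned interest vector varies with the target. The bilinear attention scoring and inner-product matching below are illustrative assumptions, not TAGNN's exact equations.

```python
# A hedged sketch of a target-attentive readout in the spirit of TAGNN.
import torch
import torch.nn as nn


class TargetAttentiveReadout(nn.Module):
    """Builds a session representation that varies with the candidate target item."""

    def __init__(self, d_model: int = 64):
        super().__init__()
        self.attn = nn.Linear(d_model, d_model, bias=False)

    def forward(self, item_states: torch.Tensor, target_emb: torch.Tensor) -> torch.Tensor:
        # item_states: (n, d) contextualized item vectors for one session
        # target_emb:  (m, d) embeddings of m candidate target items
        # Attention of each candidate target over the session items: (m, n)
        logits = target_emb @ self.attn(item_states).t()
        weights = torch.softmax(logits, dim=-1)
        # Target-specific interest vectors: one session readout per candidate.
        interest = weights @ item_states                      # (m, d)
        # Score each candidate by matching its embedding with its own readout.
        return (interest * target_emb).sum(-1)                # (m,)
```

The design point is that a session mixing several interests yields a different readout for each candidate item, instead of a single fixed session vector, which is what "adaptively activates different user interests" refers to.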