Cross-Representation Knowledge Transfer for Improved Sequential Recommendations
- URL: http://arxiv.org/abs/2602.23471v1
- Date: Thu, 26 Feb 2026 20:03:23 GMT
- Title: Cross-Representation Knowledge Transfer for Improved Sequential Recommendations
- Authors: Artur Gimranov, Viacheslav Yusupov, Elfat Sabitov, Tatyana Matveeva, Anton Lysenko, Ruslan Israfilov, Evgeny Frolov
- Abstract summary: We present a new framework that combines transformers and graph neural networks and aligns their different representations for the next-item prediction task. Our solution simultaneously encodes structural dependencies in the interaction graph and tracks their dynamic change. Experimental results on a number of open datasets demonstrate that the proposed framework consistently outperforms both pure sequential and graph approaches in terms of recommendation quality.
- Score: 0.5130440339897478
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Transformer architectures, capable of capturing sequential dependencies in the history of user interactions, have become the dominant approach in sequential recommender systems. Despite their success, such models consider sequence elements largely in isolation, accounting for the complex relationships between them only implicitly. Graph neural networks, in contrast, explicitly model these relationships through higher-order interactions but are often unable to adequately capture their evolution over time, limiting their use for predicting the next interaction. To fill this gap, we present a new framework that combines transformers and graph neural networks and aligns their different representations for solving the next-item prediction task. Our solution simultaneously encodes structural dependencies in the interaction graph and tracks their dynamic change. Experimental results on a number of open datasets demonstrate that the proposed framework consistently outperforms both pure sequential and graph approaches in terms of recommendation quality, as well as recent methods that combine both types of signals.
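The abstract describes an architecture that encodes user histories with a transformer, encodes the interaction graph with a graph neural network, and aligns the two item representations. The paper does not provide code here, so the following is a minimal PyTorch sketch under those stated assumptions only: the class name, the single-layer mean-aggregation "graph" step, and the cosine alignment loss are all illustrative choices, not the authors' method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphSequentialRecommender(nn.Module):
    """Illustrative sketch: a Transformer over interaction sequences plus a
    one-step graph propagation over the item-item adjacency, producing two
    item views that can be aligned during training."""

    def __init__(self, num_items: int, dim: int = 32):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, batch_first=True
        )
        self.seq_encoder = nn.TransformerEncoder(enc_layer, num_layers=1)
        # projection applied after neighbor aggregation (GCN-style)
        self.graph_proj = nn.Linear(dim, dim)

    def graph_item_repr(self, adj: torch.Tensor) -> torch.Tensor:
        # one propagation step: aggregate neighbor embeddings via the
        # (assumed row-normalized) adjacency matrix
        agg = adj @ self.item_emb.weight          # (num_items, d)
        return self.graph_proj(agg)

    def forward(self, seqs: torch.Tensor, adj: torch.Tensor):
        h = self.seq_encoder(self.item_emb(seqs))  # (B, L, d) sequence states
        user_repr = h[:, -1, :]                    # last position summarizes the sequence
        graph_items = self.graph_item_repr(adj)    # (num_items, d) structural view
        scores = user_repr @ graph_items.T         # next-item scores vs. graph view
        return scores, graph_items

def alignment_loss(seq_items: torch.Tensor, graph_items: torch.Tensor) -> torch.Tensor:
    # pull the sequential and graph views of the same items together (cosine)
    return 1 - F.cosine_similarity(seq_items, graph_items, dim=-1).mean()
```

In this toy setup the alignment term would be added to the next-item prediction loss, so that the sequential and structural item views stay consistent while the transformer tracks their evolution over the sequence.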
Related papers
- Topology Identification and Inference over Graphs [61.06365536861156]
Topology identification and inference of processes evolving over graphs arise in timely applications involving brain, transportation, financial, power, as well as social and information networks. This chapter provides an overview of graph topology identification and statistical inference methods for multidimensional data.
arXiv Detail & Related papers (2025-12-11T00:47:09Z) - Neural Network Reprogrammability: A Unified Theme on Model Reprogramming, Prompt Tuning, and Prompt Instruction [57.19302613163439]
We introduce neural network reprogrammability as a unifying framework for model adaptation. We present a taxonomy that categorizes such information-manipulation approaches across four key dimensions. We also analyze remaining technical challenges and ethical considerations.
arXiv Detail & Related papers (2025-06-05T05:42:27Z) - End-to-End Graph-Sequential Representation Learning for Accurate Recommendations [0.7673339435080445]
This paper presents a novel multi-representational learning framework exploiting these two paradigms' synergies.
Our empirical evaluation on several datasets demonstrates that mutual training of sequential and graph components with the proposed framework significantly improves recommendation performance.
arXiv Detail & Related papers (2024-03-01T15:32:44Z) - EvolveHypergraph: Group-Aware Dynamic Relational Reasoning for Trajectory Prediction [39.66755326557846]
We propose a group-aware relational reasoning approach (named EvolveHypergraph) with explicit inference of the underlying dynamically evolving relational structures.
Our approach infers explainable, reasonable group-aware relations and achieves state-of-the-art performance in long-term prediction.
arXiv Detail & Related papers (2022-08-10T17:57:10Z) - Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation [33.97708796846252]
We introduce a new Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to capture both short-term and long-term cross-type behavior dependencies.
Specifically, a multi-scale Transformer is equipped with low-rank self-attention to jointly encode behavior-aware sequential patterns from fine-grained and coarse-grained levels.
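The MBHT summary above mentions equipping the Transformer with low-rank self-attention. As a rough illustration of what "low-rank" means in this context, here is a Linformer-style sketch in PyTorch; this is an assumed generic mechanism for exposition, not MBHT's actual attention module.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankSelfAttention(nn.Module):
    """Illustrative low-rank self-attention: keys and values are projected
    along the sequence dimension from L positions down to rank r, reducing
    the attention cost from O(L^2) to O(L * r)."""

    def __init__(self, dim: int, seq_len: int, rank: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # learned projection compressing L sequence positions to r
        self.proj = nn.Linear(seq_len, rank, bias=False)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, L, d)
        q, k, v = self.q(x), self.k(x), self.v(x)
        # compress the sequence axis of keys and values: (B, L, d) -> (B, r, d)
        k = self.proj(k.transpose(1, 2)).transpose(1, 2)
        v = self.proj(v.transpose(1, 2)).transpose(1, 2)
        attn = F.softmax(q @ k.transpose(1, 2) * self.scale, dim=-1)  # (B, L, r)
        return attn @ v                                               # (B, L, d)
```

The attention matrix here is (L x r) rather than (L x L), which is the usual motivation for low-rank variants when long behavior sequences make full self-attention expensive.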
arXiv Detail & Related papers (2022-07-12T15:07:21Z) - Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z) - Continuous-Time Sequential Recommendation with Temporal Graph Collaborative Transformer [69.0621959845251]
We propose a new framework Temporal Graph Sequential Recommender (TGSRec) upon our defined continuous-time bi-partite graph.
The TCT layer can simultaneously capture collaborative signals from both users and items, while also considering temporal dynamics inside sequential patterns.
Empirical results on five datasets show that TGSRec significantly outperforms other baselines.
arXiv Detail & Related papers (2021-08-14T22:50:53Z) - Contrastive Self-supervised Sequential Recommendation with Robust Augmentation [101.25762166231904]
Sequential recommendation describes a set of techniques to model dynamic user behavior in order to predict future interactions in sequential user data.
Old and new issues remain, including data-sparsity and noisy data.
We propose Contrastive Self-Supervised Learning for sequential Recommendation (CoSeRec)
arXiv Detail & Related papers (2021-08-14T07:15:25Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.