Revisiting Mobility Modeling with Graph: A Graph Transformer Model for
Next Point-of-Interest Recommendation
- URL: http://arxiv.org/abs/2310.01224v1
- Date: Mon, 2 Oct 2023 14:11:16 GMT
- Title: Revisiting Mobility Modeling with Graph: A Graph Transformer Model for
Next Point-of-Interest Recommendation
- Authors: Xiaohang Xu, Toyotaro Suzumura, Jiawei Yong, Masatoshi Hanai, Chuang
Yang, Hiroki Kanezashi, Renhe Jiang, Shintaro Fukushima
- Abstract summary: Next Point-of-Interest (POI) recommendation plays a crucial role in urban mobility applications.
We propose Mobility Graph Transformer (MobGT).
MobGT enables us to fully leverage graphs to capture both the spatial and temporal features in users' mobility patterns.
- Score: 9.863982037173443
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Next Point-of-Interest (POI) recommendation plays a crucial role in urban
mobility applications. Recently, POI recommendation models based on Graph
Neural Networks (GNN) have been extensively studied; however, the
effective incorporation of both spatial and temporal information into such
GNN-based models remains challenging. Extracting distinct fine-grained features
unique to each piece of information is difficult since temporal information
often includes spatial information, as users tend to visit nearby POIs. To
address the challenge, we propose Mobility Graph Transformer (MobGT) that
enables us to fully leverage graphs to capture both the spatial and temporal
features in users' mobility patterns. MobGT combines individual spatial and
temporal graph encoders to capture unique features and global user-location
relations. Additionally, it incorporates a mobility encoder based on Graph
Transformer to extract higher-order information between POIs. To address the
long-tailed problem in spatial-temporal data, MobGT introduces a novel loss
function, Tail Loss. Experimental results demonstrate that MobGT outperforms
state-of-the-art models on various datasets and metrics, achieving a 24%
improvement on average. Our code is available at
https://github.com/Yukayo/MobGT.
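To make the described architecture concrete, the following is a minimal, hypothetical PyTorch sketch of the idea: separate spatial and temporal graph encoders produce POI representations, a Transformer-based mobility encoder consumes the visited-POI sequence, and a tail-aware loss up-weights rarely visited POIs. All module names, shapes, graph constructions, and the loss form here are illustrative assumptions rather than the authors' implementation; consult the linked repository for the actual MobGT code.

```python
import torch
import torch.nn as nn


class GraphEncoder(nn.Module):
    """Placeholder graph encoder: one round of neighborhood aggregation + linear map."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # x: (num_pois, dim) node features, adj: (num_pois, num_pois) row-normalized adjacency
        return torch.relu(self.lin(adj @ x))


class MobGTSketch(nn.Module):
    """Hypothetical sketch: spatial + temporal graph encoders feeding a Transformer."""
    def __init__(self, num_pois, dim=64, heads=4):
        super().__init__()
        self.poi_emb = nn.Embedding(num_pois, dim)
        self.spatial_enc = GraphEncoder(dim)    # distance-based POI graph
        self.temporal_enc = GraphEncoder(dim)   # visit-transition POI graph
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.mobility_enc = nn.TransformerEncoder(layer, num_layers=2)
        self.out = nn.Linear(dim, num_pois)

    def forward(self, seq, spatial_adj, temporal_adj):
        # seq: (batch, seq_len) of visited POI ids
        x = self.poi_emb.weight
        poi_repr = self.spatial_enc(x, spatial_adj) + self.temporal_enc(x, temporal_adj)
        h = self.mobility_enc(poi_repr[seq])   # (batch, seq_len, dim) check-in sequence
        return self.out(h[:, -1])              # logits over candidate next POIs


def tail_weighted_loss(logits, target, poi_counts, alpha=0.5):
    """Illustrative tail-aware loss that up-weights rarely visited POIs.
    An assumed stand-in for the paper's Tail Loss, not its definition."""
    weights = poi_counts.clamp(min=1).float().pow(-alpha)
    return nn.functional.cross_entropy(logits, target, weight=weights / weights.mean())
```

In a real setup, the spatial and temporal adjacency matrices would be built from POI distances and visit-time transitions respectively, and poi_counts from training-set visit frequencies; the paper's actual graphs and Tail Loss may be defined differently.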
Related papers
- DA-MoE: Addressing Depth-Sensitivity in Graph-Level Analysis through Mixture of Experts [70.21017141742763]
Graph neural networks (GNNs) are gaining popularity for processing graph-structured data.
Existing methods generally use a fixed number of GNN layers to generate representations for all graphs.
We propose the depth adaptive mixture of experts (DA-MoE) method, which incorporates two main improvements to GNNs.
arXiv Detail & Related papers (2024-11-05T11:46:27Z) - DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z) - Multi-Scene Generalized Trajectory Global Graph Solver with Composite
Nodes for Multiple Object Tracking [61.69892497726235]
Composite Node Message Passing Network (CoNo-Link) is a framework for modeling information across ultra-long frame sequences for association.
In addition to the previous method of treating objects as nodes, the network innovatively treats object trajectories as nodes for information interaction.
Our model can learn better predictions on longer-time scales by adding composite nodes.
arXiv Detail & Related papers (2023-12-14T14:00:30Z) - Dynamic Hypergraph Structure Learning for Traffic Flow Forecasting [35.0288931087826]
Traffic flow forecasting aims to predict future traffic conditions on the basis of networks and traffic conditions in the past.
The problem is typically solved by modeling complex spatio-temporal correlations in traffic data using spatio-temporal graph neural networks (GNNs).
Existing methods follow the paradigm of message passing that aggregates neighborhood information linearly.
In this paper, we propose a model named Dynamic Hypergraph Structure Learning (DyHSL) for traffic flow prediction.
arXiv Detail & Related papers (2023-09-21T12:44:55Z) - Local-Global Information Interaction Debiasing for Dynamic Scene Graph
Generation [51.92419880088668]
We propose a novel DynSGG model based on multi-task learning, DynSGG-MTL, which introduces the local interaction information and global human-action interaction information.
Long-term human actions supervise the model to generate multiple scene graphs that conform to the global constraints, preventing the model from failing to learn the tail predicates.
arXiv Detail & Related papers (2023-08-10T01:24:25Z) - Automated Spatio-Temporal Graph Contrastive Learning [18.245433428868775]
We develop an automated spatio-temporal augmentation scheme with a parameterized contrastive view generator.
AutoST can adapt to the heterogeneous graph with multi-view semantics well preserved.
Experiments on three downstream spatio-temporal mining tasks over several real-world datasets demonstrate significant performance gains (a generic contrastive-loss sketch appears after this list).
arXiv Detail & Related papers (2023-05-06T03:52:33Z) - Self-supervised Graph-based Point-of-interest Recommendation [66.58064122520747]
Next Point-of-Interest (POI) recommendation has become a prominent component in location-based e-commerce.
We propose a Self-supervised Graph-enhanced POI Recommender (S2GRec) for next POI recommendation.
In particular, we devise a novel Graph-enhanced Self-attentive layer to incorporate the collaborative signals from both the global transition graph and local trajectory graphs.
arXiv Detail & Related papers (2022-10-22T17:29:34Z) - Long-term Spatio-temporal Forecasting via Dynamic Multiple-Graph
Attention [20.52864145999387]
Long-term spatio-temporal forecasting (LSTF) makes use of long-term dependency between spatial and temporal domains, contextual information, and inherent patterns in the data.
We propose new graph models to represent the contextual information of each node and the long-term spatio-temporal data dependency structure revealed in the parking data.
Our proposed approaches significantly improve the performance of existing graph neural network models in LSTF prediction tasks.
arXiv Detail & Related papers (2022-04-23T06:51:37Z) - Passenger Mobility Prediction via Representation Learning for Dynamic
Directed and Weighted Graph [31.062303389341317]
We propose a novel spatio-temporal graph attention network, namely Gallat (Graph prediction with all attention), as a solution.
In Gallat, by comprehensively incorporating those three intrinsic properties of DDW graphs, we build three attention layers to fully capture the dependencies among different regions across all historical time slots.
We evaluate the proposed model on real-world datasets, and our experimental results demonstrate that Gallat outperforms the state-of-the-art approaches.
arXiv Detail & Related papers (2021-01-04T03:32:01Z) - AttnMove: History Enhanced Trajectory Recovery via Attentional Network [15.685998183691655]
We propose a novel attentional neural network-based model, named AttnMove, to densify individual trajectories by recovering unobserved locations.
We evaluate our model on two real-world datasets, and extensive results demonstrate the performance gain compared with the state-of-the-art methods.
arXiv Detail & Related papers (2021-01-03T15:45:35Z) - Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
arXiv Detail & Related papers (2020-10-19T21:51:47Z)
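As a concrete illustration of the FLAG-style augmentation summarized in the last entry, here is a minimal, hypothetical PyTorch training step that perturbs node features with gradient ascent. The model interface (a callable from features to logits), step size, and number of ascent steps are assumptions for the sketch, not the authors' exact procedure; a real GNN forward pass would also take the graph structure.

```python
import torch


def flag_style_step(model, x, y, loss_fn, optimizer, step_size=1e-3, ascent_steps=3):
    """One training step with gradient-based adversarial feature augmentation.

    model: callable mapping node features to logits (graph structure omitted for brevity).
    x: (num_nodes, feat_dim) node features; y: labels for the supervised loss.
    """
    model.train()
    optimizer.zero_grad()
    # start from a small random perturbation of the node features
    perturb = torch.empty_like(x).uniform_(-step_size, step_size).requires_grad_(True)
    for _ in range(ascent_steps):
        loss = loss_fn(model(x + perturb), y) / ascent_steps
        loss.backward()  # accumulates gradients for both the model and the perturbation
        with torch.no_grad():
            perturb += step_size * perturb.grad.sign()  # adversarial ascent on the features
        perturb.grad.zero_()
    optimizer.step()  # one descent step using the gradients accumulated across ascent steps
    return loss.item()
```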
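For the Automated Spatio-Temporal Graph Contrastive Learning entry above, the sketch below shows a generic InfoNCE-style objective between two augmented graph views. It is a standard contrastive formulation included only to make the "contrastive view generator" idea concrete, not AutoST's actual loss or view generator.

```python
import torch
import torch.nn.functional as F


def info_nce(z1, z2, temperature=0.2):
    """z1, z2: (num_nodes, dim) embeddings of the same nodes under two augmented views."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = (z1 @ z2.t()) / temperature               # cross-view similarity of every node pair
    labels = torch.arange(z1.size(0), device=z1.device)
    # each node's positive is its own embedding in the other view; all other nodes are negatives
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))
```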