Entity Representation Learning Through Onsite-Offsite Graph for Pinterest Ads
- URL: http://arxiv.org/abs/2508.02609v2
- Date: Tue, 05 Aug 2025 23:18:40 GMT
- Title: Entity Representation Learning Through Onsite-Offsite Graph for Pinterest Ads
- Authors: Jiayin Jin, Zhimeng Pan, Yang Tang, Jiarui Feng, Kungang Li, Chongyuan Xiang, Jiacheng Li, Runze Su, Siping Ji, Han Sun, Ling Leng, Prathibha Deshikachar
- Abstract summary: We develop a large-scale heterogeneous graph based on users' onsite and offsite conversion activities. We introduce TransRA, a novel Knowledge Graph Embedding (KGE) model, to more efficiently integrate graph embeddings into Ads ranking models. We observe a significant AUC lift in Click-Through Rate (CTR) and Conversion Rate (CVR) prediction models.
- Score: 17.339008918554715
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been extensively applied to industry recommendation systems, as seen in models such as GraphSage, TwHIM, and LiGNN. In these works, graphs were constructed based on users' activities on the platforms, and various graph models were developed to effectively learn node embeddings. In addition to users' onsite activities, their offsite conversions are crucial for Ads models to capture their shopping interest. To better leverage offsite conversion data and explore the connection between onsite and offsite activities, we constructed a large-scale heterogeneous graph based on users' onsite ad interactions and opt-in offsite conversion activities. Furthermore, we introduced TransRA (TransR with Anchors), a novel Knowledge Graph Embedding (KGE) model, to more efficiently integrate graph embeddings into Ads ranking models. However, our Ads ranking models initially struggled to directly incorporate Knowledge Graph Embeddings (KGE), and only modest gains were observed during offline experiments. To address this challenge, we employed the Large ID Embedding Table technique and innovated an attention-based KGE finetuning approach within the Ads ranking models. As a result, we observed a significant AUC lift in Click-Through Rate (CTR) and Conversion Rate (CVR) prediction models. Moreover, this framework has been deployed in Pinterest's Ads Engagement Model and contributed a 2.69% CTR lift and a 1.34% CPC reduction. We believe the techniques presented in this paper can be leveraged by other large-scale industrial models.
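The abstract names the building blocks but not their internals, so the PyTorch sketch below is only an illustration under stated assumptions: a standard TransR scorer (TransRA's anchor mechanism is not described in the abstract and is omitted), plus a hypothetical attention-based fusion of a frozen graph embedding with a trainable large ID embedding table, the kind of component a ranking tower might use to finetune KGE features. None of this is the paper's exact design.

```python
# Illustrative sketch only; module sizes, fusion design, and naming are assumptions.
import torch
import torch.nn as nn

class TransRScorer(nn.Module):
    """Standard TransR: project entities into a relation-specific space and score triples."""
    def __init__(self, n_ent, n_rel, ent_dim=64, rel_dim=32):
        super().__init__()
        self.ent = nn.Embedding(n_ent, ent_dim)
        self.rel = nn.Embedding(n_rel, rel_dim)
        # One projection matrix per relation, stored flattened.
        self.proj = nn.Embedding(n_rel, rel_dim * ent_dim)
        self.rel_dim, self.ent_dim = rel_dim, ent_dim

    def forward(self, h, r, t):
        M = self.proj(r).view(-1, self.rel_dim, self.ent_dim)
        h_r = torch.bmm(M, self.ent(h).unsqueeze(-1)).squeeze(-1)
        t_r = torch.bmm(M, self.ent(t).unsqueeze(-1)).squeeze(-1)
        # Higher score = more plausible triple.
        return -torch.norm(h_r + self.rel(r) - t_r, p=2, dim=-1)

class KGEFusion(nn.Module):
    """Hypothetical attention-based fusion of a frozen KGE vector with a trainable
    ID embedding, as one way to 'finetune' graph embeddings inside a CTR/CVR tower."""
    def __init__(self, kge_dim=32, id_vocab=1_000_000, id_dim=32):
        super().__init__()
        self.id_table = nn.Embedding(id_vocab, id_dim)      # large ID embedding table
        self.attn = nn.MultiheadAttention(id_dim, num_heads=2, batch_first=True)
        self.kge_proj = nn.Linear(kge_dim, id_dim)

    def forward(self, entity_ids, kge_vec):
        q = self.id_table(entity_ids).unsqueeze(1)           # (B, 1, d)
        kv = self.kge_proj(kge_vec).unsqueeze(1)             # (B, 1, d)
        fused, _ = self.attn(q, kv, kv)
        return fused.squeeze(1)                              # fed into the ranking MLP
```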
Related papers
- GRAIN: Exact Graph Reconstruction from Gradients [5.697251900862886]
Federated learning claims to enable collaborative model training among multiple clients with data privacy. Recent studies have shown that client privacy is still at risk due to so-called gradient inversion attacks. We present GRAIN, the first exact gradient inversion attack on graph data in the honest-but-curious setting.
arXiv Detail & Related papers (2025-03-03T18:58:12Z)
- An Automatic Graph Construction Framework based on Large Language Models for Recommendation [49.51799417575638]
We introduce AutoGraph, an automatic graph construction framework based on large language models for recommendation. LLMs infer user preferences and item knowledge, which are encoded as semantic vectors. Latent factors are incorporated as extra nodes to link the user/item nodes, resulting in a graph with in-depth global-view semantics.
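As a rough illustration of the latent-factor idea (not AutoGraph's actual pipeline, whose details are not in this summary), one could cluster the LLM-derived semantic vectors and link each item to its nearest cluster centroids as extra "latent factor" nodes. KMeans and the parameters below are assumptions.

```python
# Illustrative sketch: build extra (item, latent_factor) edges from semantic vectors.
import numpy as np
from sklearn.cluster import KMeans

def add_latent_factor_edges(item_vecs: np.ndarray, n_factors: int = 8, top_k: int = 2):
    """item_vecs: (n_items, dim) semantic vectors, e.g. from an LLM encoder.
    Returns (item_id, factor_id) edges linking items to latent-factor nodes."""
    km = KMeans(n_clusters=n_factors, n_init=10, random_state=0).fit(item_vecs)
    # Distance of every item to every factor centroid.
    dists = np.linalg.norm(item_vecs[:, None, :] - km.cluster_centers_[None, :, :], axis=-1)
    nearest = np.argsort(dists, axis=1)[:, :top_k]        # top_k closest factors per item
    return [(i, int(f)) for i in range(len(item_vecs)) for f in nearest[i]]

# Usage: edges = add_latent_factor_edges(np.random.rand(100, 16))
```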
arXiv Detail & Related papers (2024-12-24T07:51:29Z)
- LightSAGE: Graph Neural Networks for Large Scale Item Retrieval in Shopee's Advertisement Recommendation [2.1165011830664677]
We introduce our simple yet novel and impactful techniques in graph construction, modeling, and handling data skewness.
We construct high-quality item graphs by combining strong-signal user behaviors with a high-precision collaborative filtering (CF) algorithm.
We then develop a new GNN architecture named LightSAGE to produce high-quality item embeddings for vector search.
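The summary does not spell out the LightSAGE architecture; for context, a generic GraphSAGE-style mean-aggregation layer that produces unit-norm item embeddings for vector search looks roughly like this sketch.

```python
# Generic GraphSAGE-style layer (PyTorch); not the actual LightSAGE design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SAGEMeanLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin_self = nn.Linear(in_dim, out_dim)
        self.lin_neigh = nn.Linear(in_dim, out_dim)

    def forward(self, x, neighbor_idx):
        """x: (N, in_dim) node features; neighbor_idx: (N, K) sampled neighbor ids."""
        neigh = x[neighbor_idx].mean(dim=1)            # mean over K sampled neighbors
        h = self.lin_self(x) + self.lin_neigh(neigh)
        return F.normalize(F.relu(h), dim=-1)          # unit-norm embeddings for ANN search

# Usage: emb = SAGEMeanLayer(32, 16)(torch.randn(10, 32), torch.randint(0, 10, (10, 5)))
```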
arXiv Detail & Related papers (2023-10-30T09:57:06Z)
- A Study on Knowledge Graph Embeddings and Graph Neural Networks for Web Of Things [0.0]
Orange's vision for a knowledge graph in the domain of the Web of Things (WoT) is to provide a digital representation of the physical world.
In this paper, we explore state-of-the-art knowledge graph embedding (KGE) methods to learn numerical representations of the graph entities.
We also investigate Graph neural networks (GNN) alongside KGEs and compare their performance on the same downstream tasks.
arXiv Detail & Related papers (2023-10-23T12:36:33Z)
- Efficient Relation-aware Neighborhood Aggregation in Graph Neural Networks via Tensor Decomposition [4.041834517339835]
We propose a novel knowledge graph embedding model that incorporates tensor decomposition within the aggregation function of the Relational Graph Convolutional Network (R-GCN).
Our model enhances the representation of neighboring entities by employing projection matrices of a low-rank tensor defined by relation types.
We adopt a training strategy inspired by contrastive learning to relieve the training limitation of the 1-k-k encoder method inherent in handling vast graphs.
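A minimal sketch of relation-aware aggregation with low-rank relation projections, assuming a shared-basis decomposition; the paper's exact tensor factorization may differ.

```python
# Illustrative relation-aware aggregation with low-rank relation matrices (PyTorch).
import torch
import torch.nn as nn

class LowRankRelConv(nn.Module):
    def __init__(self, n_rel, dim, n_bases=4):
        super().__init__()
        # Each relation's projection is a weighted sum of a few shared basis matrices.
        self.bases = nn.Parameter(torch.randn(n_bases, dim, dim) * 0.1)
        self.coef = nn.Parameter(torch.randn(n_rel, n_bases) * 0.1)

    def forward(self, x, edge_src, edge_dst, edge_rel):
        """x: (N, dim) entity embeddings; edge_*: (E,) index tensors."""
        W = torch.einsum("rb,bij->rij", self.coef, self.bases)    # (n_rel, dim, dim)
        msg = torch.einsum("eij,ej->ei", W[edge_rel], x[edge_src])
        out = torch.zeros_like(x)
        out.index_add_(0, edge_dst, msg)                          # sum messages per target node
        return out
```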
arXiv Detail & Related papers (2022-12-11T19:07:34Z)
- A Graph-Enhanced Click Model for Web Search [67.27218481132185]
We propose a novel graph-enhanced click model (GraphCM) for web search.
We exploit both intra-session and inter-session information to address the sparsity and cold-start problems.
arXiv Detail & Related papers (2022-06-17T08:32:43Z)
- Contributions to Representation Learning with Graph Autoencoders and Applications to Music Recommendation [2.0849578298972835]
Graph autoencoders (GAE) and variational graph autoencoders (VGAE) emerged as powerful groups of unsupervised node embedding methods. At the beginning of this Ph.D. project, GAE and VGAE models also suffered from key limitations that prevented them from being adopted in industry. We present several contributions to improve these models, with the general aim of facilitating their use to address industrial-level problems involving graph representations.
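For context, a minimal graph autoencoder in the Kipf-and-Welling style: one GCN-like propagation as encoder and an inner-product decoder. The thesis's own extensions are not reproduced here; VGAE would add a Gaussian latent distribution.

```python
# Minimal GAE sketch (PyTorch).
import torch
import torch.nn as nn

class GAE(nn.Module):
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, emb_dim, bias=False)

    def encode(self, adj_norm, x):
        # One propagation step: Z = ReLU(A_hat X W)
        return torch.relu(adj_norm @ self.lin(x))

    def decode(self, z):
        # Reconstructed edge probabilities: sigmoid(Z Z^T)
        return torch.sigmoid(z @ z.t())

# Training typically minimizes binary cross-entropy between decode(encode(adj_norm, x))
# and the observed adjacency matrix.
```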
arXiv Detail & Related papers (2022-05-29T13:14:53Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
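For reference, the standard InfoNCE objective used in graph contrastive learning looks like the sketch below; GraphCoCo's complementary view construction itself is not reproduced.

```python
# Generic InfoNCE loss between two augmented views of the same nodes (PyTorch).
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """z1, z2: (N, d) embeddings of the same nodes under two augmented views."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                      # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)     # positives on the diagonal
    return F.cross_entropy(logits, labels)
```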
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Rethinking Graph Convolutional Networks in Knowledge Graph Completion [83.25075514036183]
Graph convolutional networks (GCNs) have become increasingly popular in knowledge graph completion (KGC).
In this paper, we build upon representative GCN-based KGC models and introduce variants to find which factor of GCNs is critical in KGC.
We propose a simple yet effective framework named LTE-KGE, which equips existing KGE models with linearly transformed entity embeddings.
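A minimal sketch of the "linearly transformed entity embeddings" idea: wrap an existing KGE scorer and apply a learned linear map to entity embeddings before scoring. Whether LTE-KGE's transformation is shared or relation-specific is not stated in this summary, and the TransE-style base scorer below is an arbitrary choice.

```python
# Illustrative LTE-style wrapper around a base KGE scorer (PyTorch).
import torch
import torch.nn as nn

class LTEWrapper(nn.Module):
    def __init__(self, ent_emb: nn.Embedding, rel_emb: nn.Embedding):
        super().__init__()
        self.ent, self.rel = ent_emb, rel_emb
        dim = ent_emb.embedding_dim
        self.transform = nn.Linear(dim, dim)      # the linear entity transformation

    def forward(self, h, r, t):
        h_e = self.transform(self.ent(h))
        t_e = self.transform(self.ent(t))
        # TransE-style score as the base KGE model.
        return -torch.norm(h_e + self.rel(r) - t_e, p=1, dim=-1)
```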
arXiv Detail & Related papers (2022-02-08T11:36:18Z)
- GraphMI: Extracting Private Graph Data from Graph Neural Networks [59.05178231559796]
We present the Graph Model Inversion attack (GraphMI), which aims to extract the private training graph data by inverting a GNN.
Specifically, we propose a projected gradient module to tackle the discreteness of graph edges while preserving the sparsity and smoothness of graph features.
We design a graph auto-encoder module to efficiently exploit graph topology, node attributes, and target model parameters for edge inference.
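A sketch of a projected-gradient update on a relaxed adjacency matrix, the kind of step such a projected gradient module performs; the attack loss and the auto-encoder component are omitted, and this is only illustrative.

```python
# Illustrative projected-gradient step on a continuous adjacency matrix (PyTorch).
import torch

def projected_gradient_step(adj_relaxed: torch.Tensor, loss: torch.Tensor, lr: float = 0.1):
    """adj_relaxed: (N, N) continuous adjacency in [0, 1] with requires_grad=True."""
    grad, = torch.autograd.grad(loss, adj_relaxed)
    with torch.no_grad():
        adj_relaxed -= lr * grad                    # gradient step on the attack objective
        adj_relaxed.clamp_(0.0, 1.0)                # project back onto the feasible box
        # Keep the matrix symmetric for an undirected graph.
        adj_relaxed.copy_((adj_relaxed + adj_relaxed.t()) / 2)
    return adj_relaxed
```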
arXiv Detail & Related papers (2021-06-05T07:07:52Z)
- Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
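A compact sketch of a FLAG-style training step: a few gradient-ascent updates on a feature perturbation while the model's gradients accumulate, then a single optimizer step. The hyperparameters, and the simplification that the model takes only node features, are assumptions.

```python
# FLAG-style adversarial feature augmentation, simplified (PyTorch).
import torch

def flag_step(model, x, y, loss_fn, optimizer, step_size=1e-3, n_ascent=3):
    """One training step; a real GNN would also take the graph structure as input."""
    model.train()
    optimizer.zero_grad()
    delta = torch.zeros_like(x).uniform_(-step_size, step_size).requires_grad_(True)
    for _ in range(n_ascent):
        loss = loss_fn(model(x + delta), y) / n_ascent   # accumulate model grads over ascent steps
        loss.backward()
        with torch.no_grad():
            delta += step_size * delta.grad.sign()       # ascend on the perturbation
            delta.grad.zero_()
    optimizer.step()                                     # uses the accumulated model gradients
    return float(loss)                                   # last (scaled) ascent-step loss
```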
arXiv Detail & Related papers (2020-10-19T21:51:47Z)
- Self-Constructing Graph Convolutional Networks for Semantic Labeling [23.623276007011373]
We propose a novel architecture called the Self-Constructing Graph (SCG), which makes use of learnable latent variables to generate embeddings.
SCG can automatically obtain optimized non-local context graphs from complex-shaped objects in aerial imagery.
We demonstrate the effectiveness and flexibility of the proposed SCG on the publicly available ISPRS Vaihingen dataset.
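A rough sketch of the self-constructing idea: learn per-node latent variables and build an affinity graph from their pairwise similarities. The actual SCG module (its variational latents and auxiliary losses) is more involved.

```python
# Illustrative self-constructed affinity graph from learned latents (PyTorch).
import torch
import torch.nn as nn

class SelfConstructingGraph(nn.Module):
    def __init__(self, in_dim, latent_dim):
        super().__init__()
        self.to_latent = nn.Linear(in_dim, latent_dim)

    def forward(self, feats):
        """feats: (N, in_dim) per-node (e.g. per-region) features."""
        z = self.to_latent(feats)                      # learnable latent variables
        adj = torch.relu(z @ z.t())                    # self-constructed affinity graph
        # Row-normalize so the graph can be used for message passing.
        adj = adj / adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return adj, z
```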
arXiv Detail & Related papers (2020-03-15T21:55:24Z)