A Transformer-Based Substitute Recommendation Model Incorporating Weakly
Supervised Customer Behavior Data
- URL: http://arxiv.org/abs/2211.02533v2
- Date: Sat, 8 Apr 2023 15:27:17 GMT
- Title: A Transformer-Based Substitute Recommendation Model Incorporating Weakly
Supervised Customer Behavior Data
- Authors: Wenting Ye, Hongfei Yang, Shuai Zhao, Haoyang Fang, Xingjian Shi,
Naveen Neppalli
- Abstract summary: The proposed model has been deployed in a large-scale E-commerce website for 11 marketplaces in 6 languages.
Our proposed model is demonstrated to increase revenue by 19% based on an online A/B experiment.
- Score: 7.427088261927881
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Substitute-based recommendation is widely used in E-commerce to
provide better alternatives to customers. However, existing research typically
relies on customer behavior signals such as co-view and
view-but-purchase-another to capture the substitute relationship. Despite its
intuitive soundness, we find that such an approach can overlook the
functionality and characteristics of products. In this paper, we recast
substitute recommendation as a language matching problem, taking product title
descriptions as model input so that product functionality is considered. We
design a new transformation method to de-noise the signals derived from
production data. In addition, we consider multilingual support from an
engineering point of view. Our proposed end-to-end transformer-based model
succeeds in both offline and online experiments. It has been deployed on a
large-scale E-commerce website for 11 marketplaces in 6 languages and is
demonstrated to increase revenue by 19% in an online A/B experiment.
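The core idea of the abstract, treating substitute recommendation as language matching over product titles, can be illustrated with a minimal sketch. The paper's actual model is a learned end-to-end transformer; here a toy bag-of-words encoder with cosine similarity stands in for that encoder, and all product titles are hypothetical examples.

```python
# Toy sketch: substitute recommendation as title matching.
# A bag-of-words encoder + cosine similarity stands in for the paper's
# learned transformer encoder; titles below are hypothetical examples.
from collections import Counter
from math import sqrt

def encode(title: str) -> Counter:
    """Stand-in 'encoder': lowercase bag-of-words term counts."""
    return Counter(title.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_substitutes(query_title: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Score each candidate title against the query and sort best-first."""
    q = encode(query_title)
    scored = [(c, cosine(q, encode(c))) for c in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)

query = "wireless optical mouse black"
candidates = [
    "wireless optical mouse white",
    "mechanical gaming keyboard",
    "usb wired mouse black",
]
ranking = rank_substitutes(query, candidates)
print(ranking[0][0])  # the white wireless mouse ranks closest
```

In the paper's setting, `encode` would be a multilingual transformer producing dense embeddings, trained on de-noised behavioral signals rather than raw co-view counts.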
Related papers
- MMGRec: Multimodal Generative Recommendation with Transformer Model [81.61896141495144]
MMGRec aims to introduce a generative paradigm into multimodal recommendation.
We first devise a hierarchical quantization method Graph CF-RQVAE to assign Rec-ID for each item from its multimodal information.
We then train a Transformer-based recommender to generate the Rec-IDs of user-preferred items based on historical interaction sequences.
arXiv Detail & Related papers (2024-04-25T12:11:27Z)
- Leveraging Large Language Models for Enhanced Product Descriptions in eCommerce [6.318353155416729]
This paper introduces a novel methodology for automating product description generation using the LLAMA 2.0 7B language model.
We train the model on a dataset of authentic product descriptions from Walmart, one of the largest eCommerce platforms.
Our findings reveal that the system is not only scalable but also significantly reduces the human workload involved in creating product descriptions.
arXiv Detail & Related papers (2023-10-24T00:55:14Z)
- Transformer Choice Net: A Transformer Neural Network for Choice Prediction [6.6543199581017625]
We develop a neural network architecture, the Transformer Choice Net, that is suitable for predicting multiple choices.
Transformer networks turn out to be especially suitable for this task as they take into account not only the features of the customer and the items but also the context.
Our architecture shows uniformly superior out-of-sample prediction performance compared to the leading models in the literature.
arXiv Detail & Related papers (2023-10-12T20:54:10Z)
- Unified Embedding Based Personalized Retrieval in Etsy Search [0.206242362470764]
We propose learning a unified embedding model incorporating graph, transformer and term-based embeddings end to end.
Our personalized retrieval model significantly improves the overall search experience, as measured by a 5.58% increase in search purchase rate and a 2.63% increase in site-wide conversion rate.
arXiv Detail & Related papers (2023-06-07T23:24:50Z)
- Learning to Diversify for Product Question Generation [68.69526529887607]
We show how the T5 pre-trained Transformer encoder-decoder model can be fine-tuned for the task.
We propose a novel learning-to-diversify (LTD) fine-tuning approach that enriches the language learned by the underlying Transformer model.
arXiv Detail & Related papers (2022-07-06T09:26:41Z)
- Entity-Graph Enhanced Cross-Modal Pretraining for Instance-level Product Retrieval [152.3504607706575]
This research aims to conduct weakly-supervised multi-modal instance-level product retrieval for fine-grained product categories.
We first contribute the Product1M datasets, and define two real practical instance-level retrieval tasks.
We train a more effective cross-modal model that adaptively incorporates key concept information from the multi-modal data.
arXiv Detail & Related papers (2022-06-17T15:40:45Z)
- ItemSage: Learning Product Embeddings for Shopping Recommendations at Pinterest [60.841761065439414]
At Pinterest, we build a single set of product embeddings called ItemSage to provide relevant recommendations in all shopping use cases.
This approach has led to significant improvements in engagement and conversion metrics, while reducing both infrastructure and maintenance cost.
arXiv Detail & Related papers (2022-05-24T02:28:58Z)
- Product1M: Towards Weakly Supervised Instance-Level Product Retrieval via Cross-modal Pretraining [108.86502855439774]
We investigate a more realistic setting that aims to perform weakly-supervised multi-modal instance-level product retrieval.
We contribute Product1M, one of the largest multi-modal cosmetic datasets for real-world instance-level retrieval.
We propose a novel model named Cross-modal contrAstive Product Transformer for instance-level prodUct REtrieval (CAPTURE).
arXiv Detail & Related papers (2021-07-30T12:11:24Z)
- PreSizE: Predicting Size in E-Commerce using Transformers [76.33790223551074]
PreSizE is a novel deep learning framework which utilizes Transformers for accurate size prediction.
We demonstrate that PreSizE is capable of achieving superior prediction performance compared to previous state-of-the-art baselines.
As a proof of concept, we demonstrate that size predictions made by PreSizE can be effectively integrated into an existing production recommender system.
arXiv Detail & Related papers (2021-05-04T15:23:59Z)
- Deep Learning-based Online Alternative Product Recommendations at Scale [0.2278231643598956]
We use both textual product information (e.g. product titles and descriptions) and customer behavior data to recommend alternative products.
Our results show that, in offline evaluations, the coverage of alternative products is significantly improved, as are recall and precision.
arXiv Detail & Related papers (2021-04-15T16:27:45Z)
- Personalized Embedding-based e-Commerce Recommendations at eBay [3.1236273633321416]
We present an approach for generating personalized item recommendations in an e-commerce marketplace by learning to embed items and users in the same vector space.
Data ablation is incorporated into the offline model training process to improve the robustness of the production system.
arXiv Detail & Related papers (2021-02-11T17:58:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.