JointMap: Joint Query Intent Understanding For Modeling Intent
Hierarchies in E-commerce Search
- URL: http://arxiv.org/abs/2005.13783v2
- Date: Fri, 29 May 2020 21:05:35 GMT
- Title: JointMap: Joint Query Intent Understanding For Modeling Intent
Hierarchies in E-commerce Search
- Authors: Ali Ahmadvand and Surya Kallumadi and Faizan Javed and Eugene
Agichtein
- Abstract summary: An accurate understanding of a user's query intent can help improve the performance of downstream tasks such as query scoping and ranking.
In this paper, we introduce Joint Query Intent Understanding (JointMap), a deep learning model to simultaneously learn two different high-level user intent tasks.
Our results show that JointMap significantly improves both "commercial vs. non-commercial" intent prediction and product category mapping by 2.3% and 10% on average over state-of-the-art deep learning methods.
- Score: 16.31114596864235
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: An accurate understanding of a user's query intent can help improve the
performance of downstream tasks such as query scoping and ranking. In the
e-commerce domain, recent work in query understanding focuses on
query-to-product-category mapping. However, a small yet significant percentage
of queries (on our website, 1.5%, or 33M queries in 2019) have non-commercial intent
associated with them. Such queries usually express information-seeking needs
like discounts, store hours, installation guides,
etc. In this paper, we introduce Joint Query Intent Understanding (JointMap), a
deep learning model to simultaneously learn two different high-level user
intent tasks: 1) identifying a query's commercial vs. non-commercial intent,
and 2) associating a set of relevant product categories in taxonomy to a
product query. The JointMap model leverages the transfer bias between these two
related tasks through a joint-learning process. As curating a
labeled data set for these tasks can be expensive and time-consuming, we
propose a distant supervision approach in conjunction with an active learning
model to generate high-quality training data sets. To demonstrate the
effectiveness of JointMap, we use search queries collected from a large
commercial website. Our results show that JointMap significantly improves both
"commercial vs. non-commercial" intent prediction and product category mapping
by 2.3% and 10% on average over state-of-the-art deep learning methods. Our
findings suggest a promising direction to model the intent hierarchies in an
e-commerce search engine.
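The joint-learning setup described above can be illustrated with a minimal sketch: a shared query encoder feeding two task-specific heads, one for binary commercial-intent prediction and one for multi-label category mapping, trained under a weighted joint loss. The architecture details (embedding size, pooling, loss weighting `alpha`) are hypothetical stand-ins, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper does not specify these details.
VOCAB, EMB, N_CATEGORIES = 100, 16, 8

# Shared query encoder: embedding lookup + mean pooling. This shared layer is
# where a joint model would transfer bias between the two related tasks.
emb_table = rng.normal(scale=0.1, size=(VOCAB, EMB))

def encode(query_token_ids):
    """Mean-pooled token embeddings as a shared query representation."""
    return emb_table[query_token_ids].mean(axis=0)

# Two task-specific heads on top of the shared representation.
w_intent = rng.normal(scale=0.1, size=EMB)               # commercial vs. non-commercial
w_cat = rng.normal(scale=0.1, size=(EMB, N_CATEGORIES))  # multi-label categories

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(query_token_ids):
    h = encode(query_token_ids)
    p_commercial = sigmoid(h @ w_intent)   # scalar probability
    p_categories = sigmoid(h @ w_cat)      # one probability per category
    return p_commercial, p_categories

def joint_loss(p_commercial, p_categories, y_intent, y_cats, alpha=0.5):
    """Weighted sum of the binary cross-entropy losses of both tasks."""
    bce = lambda p, y: -(y * np.log(p) + (1 - y) * np.log(1 - p))
    return alpha * bce(p_commercial, y_intent) + (1 - alpha) * bce(p_categories, y_cats).mean()
```

In a real implementation both heads would be trained jointly by backpropagating this combined loss through the shared encoder, which is what lets the category-mapping signal inform intent prediction and vice versa.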
Related papers
- Query-oriented Data Augmentation for Session Search [71.84678750612754]
We propose query-oriented data augmentation to enrich search logs and strengthen the modeling of search context.
We generate supplemental training pairs by altering the most important part of a search context.
We develop several strategies to alter the current query, resulting in new training data with varying degrees of difficulty.
arXiv Detail & Related papers (2024-07-04T08:08:33Z)
- List-aware Reranking-Truncation Joint Model for Search and Retrieval-augmented Generation [80.12531449946655]
We propose a Reranking-Truncation joint model (GenRT) that can perform the two tasks concurrently.
GenRT integrates reranking and truncation via a generative paradigm based on an encoder-decoder architecture.
Our method achieves SOTA performance on both reranking and truncation tasks for web search and retrieval-augmented LLMs.
arXiv Detail & Related papers (2024-02-05T06:52:53Z)
- Beyond Semantics: Learning a Behavior Augmented Relevance Model with Self-supervised Learning [25.356999988217325]
Relevance modeling aims to locate desirable items for corresponding queries.
Auxiliary query-item interactions extracted from users' historical behavior data can provide hints that further reveal search intents.
Our model builds multi-level co-attention for distilling coarse-grained and fine-grained semantic representations from both neighbor and target views.
arXiv Detail & Related papers (2023-08-10T06:52:53Z)
- Semantic Equivalence of e-Commerce Queries [6.232692545488813]
This paper introduces a framework to recognize and leverage query equivalence to enhance searcher and business outcomes.
The proposed approach addresses three key problems: mapping queries to vector representations of search intent, identifying nearest neighbor queries expressing equivalent or similar intent, and optimizing for user or business objectives.
arXiv Detail & Related papers (2023-08-07T18:40:13Z)
- UniKGQA: Unified Retrieval and Reasoning for Solving Multi-hop Question Answering Over Knowledge Graph [89.98762327725112]
Multi-hop Question Answering over Knowledge Graph (KGQA) aims to find the answer entities that are multiple hops away from the topic entities mentioned in a natural language question.
We propose UniKGQA, a novel approach for multi-hop KGQA task, by unifying retrieval and reasoning in both model architecture and parameter learning.
arXiv Detail & Related papers (2022-12-02T04:08:09Z)
- Graph Enhanced BERT for Query Understanding [55.90334539898102]
Query understanding plays a key role in exploring users' search intents and helping users locate the information they seek.
In recent years, pre-trained language models (PLMs) have advanced various natural language processing tasks.
We propose a novel graph-enhanced pre-training framework, GE-BERT, which can leverage both query content and the query graph.
arXiv Detail & Related papers (2022-04-03T16:50:30Z)
- DeepCAT: Deep Category Representation for Query Understanding in E-commerce Search [15.041444067591007]
We propose a deep learning model, DeepCAT, which learns joint word-category representations to enhance the query understanding process.
Our results show that DeepCAT reaches a 10% improvement on minority classes and a 7.1% improvement on tail queries over a state-of-the-art label embedding model.
arXiv Detail & Related papers (2021-04-23T18:04:44Z)
- AutoRC: Improving BERT Based Relation Classification Models via Architecture Search [50.349407334562045]
BERT based relation classification (RC) models have achieved significant improvements over the traditional deep learning models.
However, no consensus has been reached on the optimal architecture.
We design a comprehensive search space for BERT based RC models and employ neural architecture search (NAS) method to automatically discover the design choices.
arXiv Detail & Related papers (2020-09-22T16:55:49Z)
- Query Understanding via Intent Description Generation [75.64800976586771]
We propose a novel Query-to-Intent-Description (Q2ID) task for query understanding.
Unlike existing ranking tasks which leverage the query and its description to compute the relevance of documents, Q2ID is a reverse task which aims to generate a natural language intent description.
We demonstrate the effectiveness of our model by comparing with several state-of-the-art generation models on the Q2ID task.
arXiv Detail & Related papers (2020-08-25T08:56:40Z)
- User Intent Inference for Web Search and Conversational Agents [3.9400263964632836]
This thesis focuses on: 1) utterance topic and intent classification for conversational agents, and 2) query intent mining and classification for Web search engines.
To address the first topic, I proposed novel models to incorporate entity information and conversation-context clues to predict both topic and intent of the user's utterances.
For the second research topic, I plan to extend the existing state of the art methods in Web search intent prediction to the e-commerce domain.
arXiv Detail & Related papers (2020-05-28T07:04:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.