FolkScope: Intention Knowledge Graph Construction for E-commerce Commonsense Discovery
- URL: http://arxiv.org/abs/2211.08316v2
- Date: Thu, 11 May 2023 16:33:50 GMT
- Title: FolkScope: Intention Knowledge Graph Construction for E-commerce Commonsense Discovery
- Authors: Changlong Yu, Weiqi Wang, Xin Liu, Jiaxin Bai, Yangqiu Song, Zheng Li, Yifan Gao, Tianyu Cao, and Bing Yin
- Abstract summary: FolkScope is a framework to reveal the structure of humans' minds about purchasing items.
We generate intention assertions via e-commerce-specific prompts.
We annotate plausibility and typicality labels of sampled intentions as training data.
- Score: 36.05123266221752
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Understanding users' intentions in e-commerce platforms requires commonsense
knowledge. In this paper, we present FolkScope, an intention knowledge graph
construction framework to reveal the structure of humans' minds about
purchasing items. As commonsense knowledge is usually ineffable and not
expressed explicitly, it is challenging to perform information extraction.
Thus, we propose a new approach that leverages the generation power of large
language models~(LLMs) and human-in-the-loop annotation to semi-automatically
construct the knowledge graph. LLMs first generate intention assertions via
e-commerce-specific prompts to explain shopping behaviors, where the intention
can be an open reason or a predicate falling into one of 18 categories aligning
with ConceptNet, e.g., IsA, MadeOf, UsedFor, etc. Then we annotate plausibility
and typicality labels for sampled intentions as training data, which we use to
propagate human judgments to all automatic generations. Finally, to structure the
assertions, we apply pattern mining and conceptualization to form more
condensed and abstract knowledge. Extensive evaluations and studies demonstrate
that the constructed knowledge graph effectively models e-commerce commonsense
knowledge and has many potential applications.
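The generation step described in the abstract can be sketched in a few lines: an item and a ConceptNet-aligned relation are slotted into a natural-language template whose LLM completion becomes a candidate intention assertion. The relation subset, template wording, and function names below are illustrative assumptions, not the paper's exact prompts.

```python
# Hypothetical sketch of e-commerce-specific prompting for intention
# generation, in the spirit of FolkScope. The relation templates are
# illustrative; the paper's actual prompts may differ.

# A few of the 18 ConceptNet-aligned relations named in the abstract,
# each paired with a natural-language connective for the prompt.
RELATION_TEMPLATES = {
    "open": "PersonX bought {item} because",
    "IsA": "PersonX bought {item} because it is a",
    "MadeOf": "PersonX bought {item} because it is made of",
    "UsedFor": "PersonX bought {item} because it can be used for",
}

def build_intention_prompt(item: str, relation: str = "open") -> str:
    """Turn a purchased item and a relation type into an LLM prompt
    whose completion is a candidate intention assertion."""
    return RELATION_TEMPLATES[relation].format(item=item)

def to_assertion(item: str, relation: str, completion: str) -> tuple:
    """Package an LLM completion as a (head, relation, tail) triple
    for the intention knowledge graph."""
    return (f"buy {item}", relation, completion.strip())
```

For example, `build_intention_prompt("a yoga mat", "UsedFor")` yields a prompt ending in "it can be used for", so the model's continuation directly fills the tail of a UsedFor triple.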
Related papers
- SOK-Bench: A Situated Video Reasoning Benchmark with Aligned Open-World Knowledge [60.76719375410635]
We propose a new benchmark (SOK-Bench) consisting of 44K questions and 10K situations with instance-level annotations depicted in the videos.
The reasoning process is required to understand and apply situated knowledge and general knowledge for problem-solving.
We generate associated question-answer pairs and reasoning processes, finally followed by manual reviews for quality assurance.
arXiv Detail & Related papers (2024-05-15T21:55:31Z)
- A Usage-centric Take on Intent Understanding in E-Commerce
We focus on predicative user intents as "how a customer uses a product".
We identify two weaknesses of FolkScope, the SOTA E-Commerce Intent Graph.
They limit its ability to strongly align user intents with products having the most desirable property.
arXiv Detail & Related papers (2024-02-22T18:09:33Z)
- CANDLE: Iterative Conceptualization and Instantiation Distillation from Large Language Models for Commonsense Reasoning
CANDLE is a framework that iteratively performs conceptualization and instantiation over commonsense knowledge bases.
By applying CANDLE to ATOMIC, we construct a comprehensive knowledge base comprising six million conceptualizations and instantiated commonsense knowledge triples.
arXiv Detail & Related papers (2024-01-14T13:24:30Z)
- FabKG: A Knowledge graph of Manufacturing Science domain utilizing structured and unconventional unstructured knowledge source [1.2597961235465307]
We develop knowledge graphs based upon entity and relation data for both commercial and educational uses.
We propose a novel crowdsourcing method for KG creation by leveraging student notes.
We have created a knowledge graph containing 65000+ triples using all data sources.
arXiv Detail & Related papers (2022-05-24T02:32:04Z)
- Conditional Attention Networks for Distilling Knowledge Graphs in Recommendation [74.14009444678031]
We propose Knowledge-aware Conditional Attention Networks (KCAN) to incorporate a knowledge graph into a recommender system.
We first use knowledge-aware attention propagation to obtain node representations, which capture the global semantic similarity on the user-item network and the knowledge graph.
Then, by applying a conditional attention aggregation on the subgraph, we refine the knowledge graph to obtain target-specific node representations.
arXiv Detail & Related papers (2021-11-03T09:40:43Z)
- Think Before You Speak: Explicitly Generating Implicit Commonsense Knowledge for Response Generation [45.86667254934832]
Implicit knowledge, such as common sense, is key to fluid human conversations.
In this paper, we present Think-Before-Speaking (TBS), a generative approach that first externalizes implicit commonsense knowledge (think) and then uses this knowledge to generate responses (speak).
Empirical results show TBS models outperform end-to-end and knowledge-augmented RG baselines on most automatic metrics.
arXiv Detail & Related papers (2021-10-16T07:27:12Z)
- Generated Knowledge Prompting for Commonsense Reasoning [53.88983683513114]
We propose generating knowledge statements directly from a language model with a generic prompt format.
This approach improves performance of both off-the-shelf and finetuned language models on four commonsense reasoning tasks.
Notably, we find that a model's predictions can improve when using its own generated knowledge.
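The two-stage setup this entry describes can be sketched as prompt assembly: first elicit knowledge statements with a generic few-shot prompt, then prepend each statement to the question before answering. The instruction wording and the few-shot example below are illustrative assumptions, not the paper's exact format.

```python
# Minimal sketch of generated knowledge prompting:
# stage 1 builds a generic few-shot prompt that asks a language model
# for a knowledge statement about the input; stage 2 conditions the
# answering prompt on that generated statement.
# The demonstration pair and wording here are hypothetical.

FEWSHOT = [
    ("Greece is larger than Mexico.",
     "Greece is approximately 131,957 sq km, while Mexico is "
     "approximately 1,964,375 sq km."),
]

def knowledge_generation_prompt(question: str) -> str:
    """Stage 1: few-shot prompt whose completion is a knowledge
    statement relevant to the new question."""
    lines = ["Generate some knowledge about the input."]
    for q, k in FEWSHOT:
        lines.append(f"Input: {q}")
        lines.append(f"Knowledge: {k}")
    lines.append(f"Input: {question}")
    lines.append("Knowledge:")
    return "\n".join(lines)

def answer_prompt(question: str, knowledge: str) -> str:
    """Stage 2: prepend one generated statement to the question."""
    return f"{knowledge}\n{question}"
```

In practice the answering model is run once per generated statement and the highest-confidence prediction is kept, which is how the model's own generated knowledge can improve its predictions.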
arXiv Detail & Related papers (2021-10-15T21:58:03Z)
- DISCOS: Bridging the Gap between Discourse Knowledge and Commonsense Knowledge [42.08569149041291]
We propose an alternative commonsense knowledge acquisition framework DISCOS.
DISCOS populates expensive commonsense knowledge onto more affordable linguistic knowledge resources.
We can acquire 3.4M ATOMIC-like inferential commonsense knowledge by populating ATOMIC on the core part of ASER.
arXiv Detail & Related papers (2021-01-01T03:30:38Z)
- Common Sense or World Knowledge? Investigating Adapter-Based Knowledge Injection into Pretrained Transformers [54.417299589288184]
We investigate models for complementing the distributional knowledge of BERT with conceptual knowledge from ConceptNet and its corresponding Open Mind Common Sense (OMCS) corpus.
Our adapter-based models substantially outperform BERT on inference tasks that require the type of conceptual knowledge explicitly present in ConceptNet and OMCS.
arXiv Detail & Related papers (2020-05-24T15:49:57Z)
- ENT-DESC: Entity Description Generation by Exploring Knowledge Graph [53.03778194567752]
In practice, the input knowledge can be more than is needed, since the output description may cover only the most significant facts.
We introduce a large-scale and challenging dataset to facilitate the study of such a practical scenario in KG-to-text.
We propose a multi-graph structure that is able to represent the original graph information more comprehensively.
arXiv Detail & Related papers (2020-04-30T14:16:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.