LogicRec: Recommendation with Users' Logical Requirements
- URL: http://arxiv.org/abs/2304.11722v1
- Date: Sun, 23 Apr 2023 18:46:58 GMT
- Title: LogicRec: Recommendation with Users' Logical Requirements
- Authors: Zhenwei Tang, Griffin Floto, Armin Toroghi, Shichao Pei, Xiangliang
Zhang, Scott Sanner
- Abstract summary: We formulate the problem of recommendation with users' logical requirements (LogicRec).
We propose an initial solution for LogicRec based on logical requirement retrieval and user preference retrieval.
- Score: 29.892669049903034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Users may demand recommendations with highly personalized requirements
involving logical operations, e.g., the intersection of two requirements, where
such requirements naturally form structured logical queries on knowledge graphs
(KGs). To date, existing recommender systems lack the capability to tackle
users' complex logical requirements. In this work, we formulate the problem of
recommendation with users' logical requirements (LogicRec) and construct
benchmark datasets for LogicRec. Furthermore, we propose an initial solution
for LogicRec based on logical requirement retrieval and user preference
retrieval, where we face two challenges. First, KGs are incomplete in nature.
Therefore, there are always missing true facts, which means the answers to
logical requirements cannot be completely found in KGs. In this case, item
selection based on the answers to logical queries is not applicable. We thus
resort to logical query embedding (LQE) to jointly infer missing facts and
retrieve items based on logical requirements. Second, answer sets are
under-exploited. Existing LQE methods can only deal with query-answer pairs,
where queries in our case are the intersected user preferences and logical
requirements. However, the logical requirements and user preferences have
different answer sets, offering us richer knowledge about the requirements and
preferences by providing requirement-item and preference-item pairs. Thus, we
design a multi-task knowledge-sharing mechanism to exploit these answer sets
collectively. Extensive experimental results demonstrate the significance of
the LogicRec task and the effectiveness of our proposed method.
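The two-part retrieval and multi-task knowledge-sharing idea in the abstract can be sketched numerically. This is a toy illustration with random embeddings and a mean-based intersection operator standing in for a learned one; the variable names and operator choice are ours, not LogicRec's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_items = 8, 5

# Hypothetical embeddings: a shared item table, plus one embedding for the
# user-preference query and one for the logical-requirement query.
item_emb = rng.normal(size=(n_items, dim))
pref_emb = rng.normal(size=dim)
req_emb = rng.normal(size=dim)

def intersect(a, b):
    # Toy intersection operator: element-wise mean stands in for a
    # learned conjunction over query embeddings.
    return (a + b) / 2.0

def scores(query_emb):
    # Rank items by similarity (dot product) to a query embedding.
    return item_emb @ query_emb

# Main task: score items against the intersected query.
joint = scores(intersect(pref_emb, req_emb))

# Auxiliary tasks: requirement-item and preference-item pairs reuse the same
# item table, so the three tasks share knowledge through item_emb.
aux_req, aux_pref = scores(req_emb), scores(pref_emb)

top_item = int(np.argmax(joint))
```

Because the toy intersection is linear, the joint scores here are exactly the average of the two auxiliary score vectors; a learned intersection operator would generally break this identity.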
Related papers
- Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KGs built upon large language models (LLMs).
We augment arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs.
Experiments across widely used datasets demonstrate that LACT yields substantial improvements (an average +5.5% MRR gain) over advanced methods.
arXiv Detail & Related papers (2024-05-02T18:12:08Z)
- Prompt-fused framework for Inductive Logical Query Answering [31.736934787328156]
We propose a query-aware prompt-fused framework named Pro-QE.
We show that our model successfully handles the issue of unseen entities in logical queries.
arXiv Detail & Related papers (2024-03-19T11:30:30Z)
- Neuro-Symbolic Recommendation Model based on Logic Query [16.809190067920387]
We propose a neuro-symbolic recommendation model, which transforms the user history interactions into a logic expression.
The logic expressions are then computed based on the modular logic operations of the neural network.
Experiments on three well-known datasets verify that our method outperforms state-of-the-art shallow, deep, session-based, and reasoning models.
arXiv Detail & Related papers (2023-09-14T10:54:48Z)
- Complex Query Answering on Eventuality Knowledge Graph with Implicit Logical Constraints [48.831178420807646]
We propose a new framework to leverage neural methods to answer complex logical queries based on an EVentuality-centric KG.
Complex Eventuality Query Answering (CEQA) considers the implicit logical constraints governing the temporal order and occurrence of eventualities.
We also propose a Memory-Enhanced Query (MEQE) to significantly improve the performance of state-of-the-art neural query encoders on the CEQA task.
arXiv Detail & Related papers (2023-05-30T14:29:24Z)
- Query Structure Modeling for Inductive Logical Reasoning Over Knowledge Graphs [67.043747188954]
We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers.
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
arXiv Detail & Related papers (2023-05-23T01:25:29Z)
- Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z)
- Fuzzy Logic based Logical Query Answering on Knowledge Graph [37.039516386710716]
We present FuzzQE, a fuzzy logic based query embedding framework for answering FOL queries over KGs.
FuzzQE follows fuzzy logic to define logical operators in a principled and learning-free manner.
Experiments on two benchmark datasets demonstrate that FuzzQE achieves significantly better performance in answering FOL queries.
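As a concrete illustration of the kind of learning-free fuzzy operators FuzzQE builds on: truth values live in [0, 1], so conjunction, disjunction, and negation need no trained parameters. The product t-norm pairing below is one common fuzzy-logic instantiation, not necessarily the paper's exact choice:

```python
def f_and(a, b):
    # Product t-norm: fuzzy conjunction.
    return a * b

def f_or(a, b):
    # Probabilistic sum (the dual t-conorm): fuzzy disjunction.
    return a + b - a * b

def f_not(a):
    # Standard fuzzy negation.
    return 1.0 - a

# De Morgan's law holds exactly under this t-norm/t-conorm pairing:
a, b = 0.7, 0.4
assert abs(f_not(f_and(a, b)) - f_or(f_not(a), f_not(b))) < 1e-12
```

Because the operators are fixed closed-form functions, only the entity and relation embeddings need training, which is what makes this style of query embedding "learning-free" at the logical-operator level.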
arXiv Detail & Related papers (2021-08-05T05:54:00Z)
- Logic-Driven Context Extension and Data Augmentation for Logical Reasoning of Text [65.24325614642223]
We propose to understand logical symbols and expressions in the text to arrive at the answer.
Based on such logical information, we put forward a context extension framework and a data augmentation algorithm.
Our method achieves state-of-the-art performance, and both the logic-driven context extension framework and the data augmentation algorithm help improve accuracy.
arXiv Detail & Related papers (2021-05-08T10:09:36Z)
- Logic Embeddings for Complex Query Answering [56.25151854231117]
We propose Logic Embeddings, a new approach to embedding complex queries that uses Skolemisation to eliminate existential variables for efficient querying.
We show that Logic Embeddings are competitively fast and accurate in query answering over large, incomplete knowledge graphs, outperform on negation queries, and in particular, provide improved modeling of answer uncertainty.
arXiv Detail & Related papers (2021-02-28T07:52:37Z)
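Skolemisation, the transformation Logic Embeddings rely on to eliminate existential variables, can be illustrated with a minimal symbolic sketch; the tuple-based term representation and names here are ours for illustration, not the paper's implementation:

```python
from itertools import count

_fresh = count()

def skolemise(formula, env=None):
    """Replace each existentially quantified variable with a fresh Skolem
    constant so that only free/target variables remain.

    formula: ('exists', var, body) | ('and', f, g) | an atom tuple.
    """
    env = env or {}
    tag = formula[0]
    if tag == 'exists':
        _, var, body = formula
        # Bind the existential variable to a fresh Skolem constant.
        env = {**env, var: f'sk{next(_fresh)}'}
        return skolemise(body, env)
    if tag == 'and':
        return ('and', skolemise(formula[1], env), skolemise(formula[2], env))
    # Atom: substitute bound variables with their Skolem terms.
    return tuple(env.get(t, t) for t in formula)

# Hypothetical KG query: "entities x born in some place v located in Canada".
q = ('exists', 'v', ('and', ('located', 'v', 'Canada'), ('born', 'x', 'v')))
result = skolemise(q)
# result == ('and', ('located', 'sk0', 'Canada'), ('born', 'x', 'sk0'))
```

With the existential variable gone, the query reduces to a quantifier-free conjunction that can be scored directly against embeddings, which is the efficiency the summary refers to.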
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.