A Probabilistic-Logic based Commonsense Representation Framework for
Modelling Inferences with Multiple Antecedents and Varying Likelihoods
- URL: http://arxiv.org/abs/2211.16822v1
- Date: Wed, 30 Nov 2022 08:44:30 GMT
- Title: A Probabilistic-Logic based Commonsense Representation Framework for
Modelling Inferences with Multiple Antecedents and Varying Likelihoods
- Authors: Shantanu Jaiswal, Liu Yan, Dongkyu Choi, Kenneth Kwok
- Abstract summary: Commonsense knowledge-graphs (CKGs) are important resources towards building machines that can 'reason' on text or environmental inputs and make inferences beyond perception.
In this work, we study how commonsense knowledge can be better represented by -- (i) utilizing a probabilistic logic representation scheme to model composite inferential knowledge and represent conceptual beliefs with varying likelihoods, and (ii) incorporating a hierarchical conceptual ontology to identify salient concept-relevant relations and organize beliefs at different conceptual levels.
- Score: 5.87677276882675
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Commonsense knowledge-graphs (CKGs) are important resources towards building
machines that can 'reason' on text or environmental inputs and make inferences
beyond perception. While current CKGs encode world knowledge for a large number
of concepts and have been effectively utilized for incorporating commonsense in
neural models, they primarily encode declarative or single-condition
inferential knowledge and assume all conceptual beliefs to have the same
likelihood. Further, these CKGs utilize a limited set of relations shared
across concepts and lack a coherent knowledge organization structure resulting
in redundancies as well as sparsity across the larger knowledge graph.
Consequently, today's CKGs, while useful for a first level of reasoning, do not
adequately capture deeper human-level commonsense inferences which can be more
nuanced and influenced by multiple contextual or situational factors.
Accordingly, in this work, we study how commonsense knowledge can be better
represented by -- (i) utilizing a probabilistic logic representation scheme to
model composite inferential knowledge and represent conceptual beliefs with
varying likelihoods, and (ii) incorporating a hierarchical conceptual ontology
to identify salient concept-relevant relations and organize beliefs at
different conceptual levels. Our resulting knowledge representation framework
can encode a wider variety of world knowledge and represent beliefs flexibly
using grounded concepts as well as free-text phrases. As a result, the
framework can be utilized as both a traditional free-text knowledge graph and a
grounded logic-based inference system more suitable for neuro-symbolic
applications. We describe how we extend the PrimeNet knowledge base with our
framework through crowd-sourcing and expert-annotation, and demonstrate its
application for more interpretable passage-based semantic parsing and question
answering.
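The abstract describes, but does not specify, a probabilistic-logic scheme in which beliefs carry varying likelihoods and composite rules combine multiple antecedents. As an illustration only, such a scheme might be sketched along these lines; all class names, relations, and weights below are hypothetical, and the independence assumption in the combination step is a simplification, not the paper's method:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Belief:
    """A conceptual belief: a relation between concepts with a likelihood."""
    head: str          # subject concept or free-text phrase
    relation: str
    tail: str          # object concept or free-text phrase
    likelihood: float  # subjective probability in [0, 1]

@dataclass(frozen=True)
class Rule:
    """A composite inference: several antecedent beliefs jointly support a conclusion."""
    antecedents: tuple
    conclusion: Belief

def rule_likelihood(rule: Rule) -> float:
    # Naive conjunction: treat antecedents as independent and multiply
    # their likelihoods into the conclusion's own weight.
    p = rule.conclusion.likelihood
    for b in rule.antecedents:
        p *= b.likelihood
    return p

# Hypothetical example: "if it is raining and a person is outdoors,
# the person likely gets wet", with per-belief likelihoods.
raining = Belief("weather", "HasState", "raining", 1.0)
outside = Belief("person", "IsLocatedAt", "outdoors", 0.8)
gets_wet = Belief("person", "HasState", "wet", 0.9)
rule = Rule((raining, outside), gets_wet)
print(round(rule_likelihood(rule), 3))  # 0.72
```

The point of the sketch is only that beliefs are weighted individually and that a rule's plausibility depends on all of its antecedents, which is what distinguishes the proposed representation from single-condition CKG triples.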
Related papers
- Reasoning about concepts with LLMs: Inconsistencies abound [13.042591838719936]
Large language models (LLMs) often display significant inconsistencies in their knowledge.
In particular, we were able to significantly enhance the performance of various sizes of LLMs with openly available weights.
arXiv Detail & Related papers (2024-05-30T15:38:54Z) - COPEN: Probing Conceptual Knowledge in Pre-trained Language Models [60.10147136876669]
Conceptual knowledge is fundamental to human cognition and knowledge bases.
Existing knowledge probing works only focus on factual knowledge of pre-trained language models (PLMs) and ignore conceptual knowledge.
We design three tasks to probe whether PLMs organize entities by conceptual similarities, learn conceptual properties, and conceptualize entities in contexts.
For these tasks, we collect and annotate 24k data instances covering 393 concepts, forming COPEN, a COnceptual knowledge Probing bENchmark.
arXiv Detail & Related papers (2022-11-08T08:18:06Z) - Understanding Substructures in Commonsense Relations in ConceptNet [8.591839265985412]
We present a methodology based on unsupervised knowledge graph representation learning and clustering to reveal and study substructures in three heavily used commonsense relations in ConceptNet.
Our results show that, despite having an 'official' definition in ConceptNet, many of these commonsense relations exhibit considerable sub-structure.
In the future, therefore, such relations could be sub-divided into other relations with more refined definitions.
arXiv Detail & Related papers (2022-10-03T22:59:07Z) - Acquiring and Modelling Abstract Commonsense Knowledge via Conceptualization [49.00409552570441]
We study the role of conceptualization in commonsense reasoning, and formulate a framework to replicate human conceptual induction.
We apply the framework to ATOMIC, a large-scale human-annotated CKG, aided by the taxonomy Probase.
arXiv Detail & Related papers (2022-06-03T12:24:49Z) - Knowledge Graph Augmented Network Towards Multiview Representation
Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z) - Contextualized Knowledge-aware Attentive Neural Network: Enhancing
Answer Selection with Knowledge [77.77684299758494]
We extensively investigate approaches to enhancing the answer selection model with external knowledge from a knowledge graph (KG).
First, we present a context-knowledge interaction learning framework, Knowledge-aware Neural Network (KNN), which learns the QA sentence representations by considering a tight interaction with the external knowledge from KG and the textual information.
To handle the diversity and complexity of KG information, we propose a Contextualized Knowledge-aware Attentive Neural Network (CKANN), which improves knowledge representation learning with structure information via a customized Graph Convolutional Network (GCN) and comprehensively learns context-based and knowledge-based sentence representations.
arXiv Detail & Related papers (2021-04-12T05:52:20Z) - A Data-Driven Study of Commonsense Knowledge using the ConceptNet
Knowledge Base [8.591839265985412]
Acquiring commonsense knowledge and reasoning is recognized as an important frontier in achieving general Artificial Intelligence (AI).
In this paper, we propose and conduct a systematic study to enable a deeper understanding of commonsense knowledge by doing an empirical and structural analysis of the ConceptNet knowledge base.
Detailed experimental results on three carefully designed research questions, using state-of-the-art unsupervised graph representation learning ('embedding') and clustering techniques, reveal deep substructures in ConceptNet relations.
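The pipeline summarized above, embedding relation triples and then clustering them to surface substructure, can be gestured at with a toy example. The real study uses learned graph embeddings over ConceptNet; the 2-D "embeddings", the minimal k-means, and the two visible groups below are all hypothetical stand-ins for illustration:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Toy k-means: returns a cluster label for each point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
        # Recompute each center as the mean of its assigned points.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centers[c] = tuple(sum(dim) / len(members) for dim in zip(*members))
    return labels

# Hypothetical 2-D "embeddings" of triples sharing one relation;
# two well-separated groups stand in for relational substructure.
triples = [(0.1, 0.2), (0.15, 0.1), (0.2, 0.25),
           (0.9, 0.8), (0.95, 0.9), (1.0, 0.85)]
labels = kmeans(triples, k=2)
print(labels)  # first three points share one label, last three the other
```

If triples using the same relation fall into distinct clusters like this, that is the kind of evidence the paper reads as substructure warranting more refined relation definitions.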
arXiv Detail & Related papers (2020-11-28T08:08:25Z) - Expressiveness and machine processability of Knowledge Organization
Systems (KOS): An analysis of concepts and relations [0.0]
The expressiveness and machine processability of a Knowledge Organization System are largely determined by its structural rules.
Ontologies explicitly define diverse types of relations, and are by their nature machine-processable.
arXiv Detail & Related papers (2020-03-11T12:35:52Z) - Neuro-symbolic Architectures for Context Understanding [59.899606495602406]
We propose the use of hybrid AI methodology as a framework for combining the strengths of data-driven and knowledge-driven approaches.
Specifically, we inherit the concept of neuro-symbolism as a way of using knowledge-bases to guide the learning progress of deep neural networks.
arXiv Detail & Related papers (2020-03-09T15:04:07Z) - On the Role of Conceptualization in Commonsense Knowledge Graph
Construction [59.39512925793171]
Commonsense knowledge graphs (CKGs) like ATOMIC and ASER are substantially different from conventional KGs.
We introduce conceptualization into CKG construction methods, viewing entities mentioned in text as instances of specific concepts, or vice versa.
Our methods can effectively identify plausible triples and expand the KG by triples of both new nodes and edges of high diversity and novelty.
arXiv Detail & Related papers (2020-03-06T14:35:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.