TransOMCS: From Linguistic Graphs to Commonsense Knowledge
- URL: http://arxiv.org/abs/2005.00206v1
- Date: Fri, 1 May 2020 04:03:58 GMT
- Title: TransOMCS: From Linguistic Graphs to Commonsense Knowledge
- Authors: Hongming Zhang, Daniel Khashabi, Yangqiu Song, Dan Roth
- Abstract summary: Conventional methods of acquiring commonsense knowledge require laborious and costly human annotations.
We explore a practical way of mining commonsense knowledge from linguistic graphs, with the goal of transferring cheap knowledge obtained with linguistic patterns into expensive commonsense knowledge.
Experimental results demonstrate the transferability of linguistic knowledge to commonsense knowledge and the effectiveness of the proposed approach in terms of quantity, novelty, and quality.
- Score: 109.36596335148091
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Commonsense knowledge acquisition is a key problem for artificial
intelligence. Conventional methods of acquiring commonsense knowledge generally
require laborious and costly human annotations, which are not feasible on a
large scale. In this paper, we explore a practical way of mining commonsense
knowledge from linguistic graphs, with the goal of transferring cheap knowledge
obtained with linguistic patterns into expensive commonsense knowledge. The
result is a conversion of ASER [Zhang et al., 2020], a large-scale selectional
preference knowledge resource, into TransOMCS, of the same representation as
ConceptNet [Liu and Singh, 2004] but two orders of magnitude larger.
Experimental results demonstrate the transferability of linguistic knowledge to
commonsense knowledge and the effectiveness of the proposed approach in terms
of quantity, novelty, and quality. TransOMCS is publicly available at:
https://github.com/HKUST-KnowComp/TransOMCS.
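For a concrete sense of what "the same representation as ConceptNet" means, here is a minimal Python sketch of loading the released tuples and keeping only high-confidence ones. It is not the authors' code: the file name TransOMCS_full.txt, the tab-separated (head, relation, tail, confidence) column layout, and the 0.9 cut-off are assumptions about the public release.

```python
# Minimal sketch: read TransOMCS tuples in ConceptNet-style (head, relation, tail)
# form. File name, column order, and the 0.9 confidence cut-off are assumptions
# about the public release, not details from the paper.
from collections import Counter

def load_transomcs(path="TransOMCS_full.txt", min_confidence=0.9):
    triples = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != 4:
                continue  # skip malformed rows
            head, relation, tail, confidence = parts
            if float(confidence) >= min_confidence:
                triples.append((head, relation, tail))
    return triples

if __name__ == "__main__":
    triples = load_transomcs()
    print(f"{len(triples)} high-confidence tuples")
    # Distribution over ConceptNet-style relations such as CapableOf or UsedFor.
    print(Counter(rel for _, rel, _ in triples).most_common(10))
```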
Related papers
- Commonsense Knowledge Transfer for Pre-trained Language Models [83.01121484432801]
We introduce commonsense knowledge transfer, a framework to transfer the commonsense knowledge stored in a neural commonsense knowledge model to a general-purpose pre-trained language model.
It first exploits general texts to form queries for extracting commonsense knowledge from the neural commonsense knowledge model.
It then refines the language model with two self-supervised objectives: commonsense mask infilling and commonsense relation prediction.
arXiv Detail & Related papers (2023-06-04T15:44:51Z)
- Lifelong Learning Natural Language Processing Approach for Multilingual Data Classification [1.3999481573773074]
We propose a lifelong learning-inspired approach, which allows for fake news detection in multiple languages.
We also observed the models' ability to generalize knowledge acquired across the analyzed languages.
arXiv Detail & Related papers (2022-05-25T10:34:04Z)
- Generated Knowledge Prompting for Commonsense Reasoning [53.88983683513114]
We propose generating knowledge statements directly from a language model with a generic prompt format.
This approach improves performance of both off-the-shelf and finetuned language models on four commonsense reasoning tasks.
Notably, we find that a model's predictions can improve when using its own generated knowledge.
arXiv Detail & Related papers (2021-10-15T21:58:03Z)
- Commonsense Knowledge in Word Associations and ConceptNet [37.751909219863585]
This paper presents an in-depth comparison of two large-scale resources of general knowledge: ConceptNet and SWOW.
We examine the structure, overlap and differences between the two graphs, as well as the extent to which they encode situational commonsense knowledge.
arXiv Detail & Related papers (2021-09-20T06:06:30Z)
- DISCOS: Bridging the Gap between Discourse Knowledge and Commonsense Knowledge [42.08569149041291]
We propose an alternative commonsense knowledge acquisition framework, DISCOS.
DISCOS populates expensive commonsense knowledge onto more affordable linguistic knowledge resources.
We can acquire 3.4M ATOMIC-like inferential commonsense knowledge tuples by populating ATOMIC onto the core part of ASER.
arXiv Detail & Related papers (2021-01-01T03:30:38Z)
- Towards a Universal Continuous Knowledge Base [49.95342223987143]
We propose a method for building a continuous knowledge base that can store knowledge imported from multiple neural networks.
We import the knowledge from multiple models into the knowledge base, from which the fused knowledge is exported back to a single model.
Experiments on text classification show promising results.
arXiv Detail & Related papers (2020-12-25T12:27:44Z)
- CoLAKE: Contextualized Language and Knowledge Embedding [81.90416952762803]
We propose the Contextualized Language and Knowledge Embedding (CoLAKE).
CoLAKE jointly learns contextualized representations for both language and knowledge with an extended training objective.
We conduct experiments on knowledge-driven tasks, knowledge probing tasks, and language understanding tasks.
arXiv Detail & Related papers (2020-10-01T11:39:32Z)
- Common Sense or World Knowledge? Investigating Adapter-Based Knowledge Injection into Pretrained Transformers [54.417299589288184]
We investigate models for complementing the distributional knowledge of BERT with conceptual knowledge from ConceptNet and its corresponding Open Mind Common Sense (OMCS) corpus.
Our adapter-based models substantially outperform BERT on inference tasks that require the type of conceptual knowledge explicitly present in ConceptNet and OMCS.
arXiv Detail & Related papers (2020-05-24T15:49:57Z)
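The last entry above injects ConceptNet/OMCS knowledge into a pretrained Transformer through adapters. As a rough, hedged illustration of the general bottleneck-adapter idea (not that paper's exact configuration), a small residual module is placed after a frozen encoder sub-layer and only its weights are trained on the knowledge source; the hidden and bottleneck sizes below are illustrative.

```python
# Minimal sketch of a bottleneck adapter that could follow a Transformer
# sub-layer; sizes and placement are illustrative, not the paper's settings.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden_size=768, bottleneck_size=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, hidden_states):
        # Residual connection keeps the frozen pretrained representation intact;
        # only the small adapter weights are trained on the knowledge source.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Usage: wrap a frozen encoder layer's output.
adapter = Adapter()
hidden = torch.randn(2, 16, 768)   # (batch, sequence, hidden)
print(adapter(hidden).shape)       # torch.Size([2, 16, 768])
```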
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.